Dataset columns, with minimum and maximum observed lengths (characters for string columns, items for sequence columns):

| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | sequence | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | sequence | 0 | 25 |
| languages | sequence | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | sequence | 0 | 352 |
| processed_texts | sequence | 1 | 353 |
d9b7bf01bbe7b2685ff4d24d1edd496f6469bebc
# Dataset Card for "mmlu-virology-neg-prepend-verbal" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-virology-neg-prepend-verbal
[ "region:us" ]
2024-01-10T23:41:52+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "negate_openai_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "neg_question", "dtype": "string"}, {"name": "fewshot_context", "dtype": "string"}, {"name": "ori_prompt", "dtype": "string"}, {"name": "neg_prompt", "dtype": "string"}, {"name": "fewshot_context_neg", "dtype": "string"}, {"name": "fewshot_context_ori", "dtype": "string"}], "splits": [{"name": "dev", "num_bytes": 6318, "num_examples": 5}, {"name": "test", "num_bytes": 1166677, "num_examples": 166}], "download_size": 160485, "dataset_size": 1172995}}
2024-01-11T07:09:41+00:00
[]
[]
08fd766eac71b29c27de3d1c392168eab299a5bf
# Dataset Card for "eval-gauntlet" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
eitanturok/eval-gauntlet
[ "region:us" ]
2024-01-10T23:50:57+00:00
{"dataset_info": [{"config_name": "gsm8k", "features": [{"name": "context", "dtype": "string"}, {"name": "chain_of_thought", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 711094, "num_examples": 1319}], "download_size": 417547, "dataset_size": 711094}, {"config_name": "jeopardy", "features": [{"name": "category", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "continuation", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 304469, "num_examples": 2181}], "download_size": 186182, "dataset_size": 304469}, {"config_name": "trivia_qa", "features": [{"name": "context", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "aliases", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 4519839, "num_examples": 11313}], "download_size": 2659436, "dataset_size": 4519839}], "configs": [{"config_name": "gsm8k", "data_files": [{"split": "train", "path": "gsm8k/train-*"}]}, {"config_name": "jeopardy", "data_files": [{"split": "train", "path": "jeopardy/train-*"}]}, {"config_name": "trivia_qa", "data_files": [{"split": "train", "path": "trivia_qa/train-*"}]}]}
2024-01-12T18:36:25+00:00
[]
[]
3727c077c628314183801353b3488c92b879b2bb
# Mixtral Magicoder: Source Code Is All You Need on various programming languages

We sampled programming languages from https://huggingface.co/datasets/bigcode/the-stack-dedup and pushed the sample to https://huggingface.co/datasets/malaysia-ai/starcoderdata-sample

After that, we used the [Magicoder: Source Code Is All You Need](https://github.com/ise-uiuc/magicoder) template, targeting at least 10k rows for each programming language:

1. C++, 10747 rows
2. C#, 10193 rows
3. CUDA, 13843 rows
4. Dockerfile, 13286 rows
5. Go, 10143 rows
6. Java, 11221 rows
7. JavaScript, 11758 rows
8. Kotlin, 12790 rows
9. PHP, 10176 rows
10. Python, excluding `pandas`, `sklearn`, `matplotlib`, and `plotly`, 10925 rows
11. Python, containing `pandas`, `sklearn`, `matplotlib`, or `plotly` (focused on data analytics), 53959 rows
12. Ruby, 10201 rows
13. Rust, 10271 rows
14. Scala, 10017 rows
15. Shell, 10848 rows
16. SQL, 27668 rows
17. Swift, 10187 rows
18. TypeScript, 14248 rows

Source code at https://github.com/mesolitica/malaysian-dataset/tree/master/chatbot/mixtral-magicoder

## Precautions

1. There is no validation of the generated output.
2. Always filter out short answers.

## Filtered version

1. Dropped short answers.
2. Dropped answers containing `code snippet`.

Uploaded at [postfilter.jsonl](postfilter.jsonl). A minimal filtering sketch is shown after this card.

## Infrastructure specification

1. 5 instances of 4x A100s each (NC96ads A100 v4, spot), total run ~48 hours: 48 * 1.954 (US East, https://instances.vantage.sh/azure/vm/nc96ads-v4) * 5 ~= 469 USD.
2. HuggingFace Text Generation Inference engine.
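The filtering described above can be expressed in a few lines of Python. This is a sketch under stated assumptions: the row field name and the length threshold are invented for illustration (the card does not specify them), and `prefilter.jsonl` is a hypothetical name for the unfiltered dump; only `postfilter.jsonl` comes from the card:

```python
import json

MIN_ANSWER_CHARS = 200  # assumed threshold; the card does not state one

# "prefilter.jsonl" is a hypothetical name for the unfiltered dump.
with open("prefilter.jsonl") as fin, open("postfilter.jsonl", "w") as fout:
    for line in fin:
        row = json.loads(line)
        answer = row.get("answer", "")  # assumed field name
        if len(answer) < MIN_ANSWER_CHARS:
            continue  # drop short answers
        if "code snippet" in answer.lower():
            continue  # drop rows that contain the phrase "code snippet"
        fout.write(json.dumps(row) + "\n")
```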
mesolitica/mixtral-magicoder
[ "task_categories:conversational", "language:en", "language:ms", "license:mit", "region:us" ]
2024-01-11T00:04:30+00:00
{"language": ["en", "ms"], "license": "mit", "task_categories": ["conversational"]}
2024-01-23T05:42:03+00:00
[]
[ "en", "ms" ]
789327e8dedb68a940d890d861d93485764d04ce
# Dataset Card for Evaluation run of cloudyu/Yi-34Bx2-MoE-60B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [cloudyu/Yi-34Bx2-MoE-60B](https://huggingface.co/cloudyu/Yi-34Bx2-MoE-60B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-01-11T00:14:54.121598](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B/blob/main/results_2024-01-11T00-14-54.121598.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split for each eval):

```python
{ "all": { "acc": 0.7719265002005771, "acc_stderr": 0.027890629800356333, "acc_norm": 0.7749305083860206, "acc_norm_stderr": 0.0284361463203916, "mc1": 0.49326805385556916, "mc1_stderr": 0.01750191449265539, "mc2": 0.6619082030385652, "mc2_stderr": 0.014547333891309428 }, "harness|arc:challenge|25": { "acc": 0.6723549488054608, "acc_stderr": 0.01371584794071934, "acc_norm": 0.7107508532423208, "acc_norm_stderr": 0.013250012579393443 }, "harness|hellaswag|10": { "acc": 0.6537542322246565, "acc_stderr": 0.00474800327646621, "acc_norm": 0.852320254929297, "acc_norm_stderr": 0.0035405716545956313 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7333333333333333, "acc_stderr": 0.038201699145179055, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.038201699145179055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.875, "acc_stderr": 0.026913523521537846, "acc_norm": 0.875, "acc_norm_stderr": 0.026913523521537846 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8, "acc_stderr": 0.024618298195866514, "acc_norm": 0.8, "acc_norm_stderr": 0.024618298195866514 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.875, "acc_stderr": 0.02765610492929436, "acc_norm": 0.875, "acc_norm_stderr": 0.02765610492929436 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252606, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.44, "acc_stderr": 
0.0498887651569859, "acc_norm": 0.44, "acc_norm_stderr": 0.0498887651569859 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7630057803468208, "acc_stderr": 0.03242414757483098, "acc_norm": 0.7630057803468208, "acc_norm_stderr": 0.03242414757483098 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.047551296160629475, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.047551296160629475 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7957446808510639, "acc_stderr": 0.026355158413349417, "acc_norm": 0.7957446808510639, "acc_norm_stderr": 0.026355158413349417 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6140350877192983, "acc_stderr": 0.04579639422070434, "acc_norm": 0.6140350877192983, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7655172413793103, "acc_stderr": 0.035306258743465914, "acc_norm": 0.7655172413793103, "acc_norm_stderr": 0.035306258743465914 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.7275132275132276, "acc_stderr": 0.022930973071633363, "acc_norm": 0.7275132275132276, "acc_norm_stderr": 0.022930973071633363 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5873015873015873, "acc_stderr": 0.04403438954768176, "acc_norm": 0.5873015873015873, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9032258064516129, "acc_stderr": 0.016818943416345197, "acc_norm": 0.9032258064516129, "acc_norm_stderr": 0.016818943416345197 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6305418719211823, "acc_stderr": 0.03395970381998573, "acc_norm": 0.6305418719211823, "acc_norm_stderr": 0.03395970381998573 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.82, "acc_stderr": 0.03861229196653694, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8666666666666667, "acc_stderr": 0.026544435312706463, "acc_norm": 0.8666666666666667, "acc_norm_stderr": 0.026544435312706463 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9292929292929293, "acc_stderr": 0.018263105420199505, "acc_norm": 0.9292929292929293, "acc_norm_stderr": 0.018263105420199505 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9689119170984456, "acc_stderr": 0.012525310625527033, "acc_norm": 0.9689119170984456, "acc_norm_stderr": 0.012525310625527033 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.823076923076923, "acc_stderr": 0.019348070174396995, "acc_norm": 0.823076923076923, "acc_norm_stderr": 0.019348070174396995 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.44814814814814813, "acc_stderr": 0.030321167196316286, "acc_norm": 0.44814814814814813, "acc_norm_stderr": 0.030321167196316286 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8487394957983193, "acc_stderr": 0.023274255898707946, "acc_norm": 0.8487394957983193, "acc_norm_stderr": 0.023274255898707946 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5099337748344371, "acc_stderr": 0.04081677107248437, "acc_norm": 0.5099337748344371, "acc_norm_stderr": 
0.04081677107248437 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9155963302752294, "acc_stderr": 0.011918819327334877, "acc_norm": 0.9155963302752294, "acc_norm_stderr": 0.011918819327334877 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6805555555555556, "acc_stderr": 0.03179876342176851, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.03179876342176851 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9215686274509803, "acc_stderr": 0.018869514646658935, "acc_norm": 0.9215686274509803, "acc_norm_stderr": 0.018869514646658935 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8987341772151899, "acc_stderr": 0.019637720526065522, "acc_norm": 0.8987341772151899, "acc_norm_stderr": 0.019637720526065522 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7892376681614349, "acc_stderr": 0.02737309550054019, "acc_norm": 0.7892376681614349, "acc_norm_stderr": 0.02737309550054019 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.9083969465648855, "acc_stderr": 0.025300035578642962, "acc_norm": 0.9083969465648855, "acc_norm_stderr": 0.025300035578642962 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9008264462809917, "acc_stderr": 0.027285246312758957, "acc_norm": 0.9008264462809917, "acc_norm_stderr": 0.027285246312758957 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8703703703703703, "acc_stderr": 0.03247224389917947, "acc_norm": 0.8703703703703703, "acc_norm_stderr": 0.03247224389917947 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8834355828220859, "acc_stderr": 0.025212327210507104, "acc_norm": 0.8834355828220859, "acc_norm_stderr": 0.025212327210507104 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6428571428571429, "acc_stderr": 0.04547960999764376, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.04547960999764376 }, "harness|hendrycksTest-management|5": { "acc": 0.912621359223301, "acc_stderr": 0.027960689125970654, "acc_norm": 0.912621359223301, "acc_norm_stderr": 0.027960689125970654 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9444444444444444, "acc_stderr": 0.015006312806446912, "acc_norm": 0.9444444444444444, "acc_norm_stderr": 0.015006312806446912 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9080459770114943, "acc_stderr": 0.010333225570778521, "acc_norm": 0.9080459770114943, "acc_norm_stderr": 0.010333225570778521 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8265895953757225, "acc_stderr": 0.020383229551135026, "acc_norm": 0.8265895953757225, "acc_norm_stderr": 0.020383229551135026 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.8100558659217877, "acc_stderr": 0.01311902831049268, "acc_norm": 0.8100558659217877, "acc_norm_stderr": 0.01311902831049268 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8594771241830066, "acc_stderr": 0.019899435463539946, "acc_norm": 0.8594771241830066, "acc_norm_stderr": 0.019899435463539946 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8263665594855305, "acc_stderr": 0.021514051585970403, "acc_norm": 0.8263665594855305, "acc_norm_stderr": 0.021514051585970403 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8765432098765432, "acc_stderr": 0.01830386880689179, "acc_norm": 0.8765432098765432, "acc_norm_stderr": 0.01830386880689179 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6418439716312057, "acc_stderr": 
0.028602085862759422, "acc_norm": 0.6418439716312057, "acc_norm_stderr": 0.028602085862759422 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6003911342894394, "acc_stderr": 0.012510181636960679, "acc_norm": 0.6003911342894394, "acc_norm_stderr": 0.012510181636960679 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8235294117647058, "acc_stderr": 0.02315746830855936, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.02315746830855936 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8235294117647058, "acc_stderr": 0.015422512066262554, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.015422512066262554 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.043091187099464585, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8489795918367347, "acc_stderr": 0.02292300409473685, "acc_norm": 0.8489795918367347, "acc_norm_stderr": 0.02292300409473685 }, "harness|hendrycksTest-sociology|5": { "acc": 0.900497512437811, "acc_stderr": 0.021166216304659393, "acc_norm": 0.900497512437811, "acc_norm_stderr": 0.021166216304659393 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.92, "acc_stderr": 0.0272659924344291, "acc_norm": 0.92, "acc_norm_stderr": 0.0272659924344291 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.038695433234721015, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.038695433234721015 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8888888888888888, "acc_stderr": 0.024103384202072867, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.024103384202072867 }, "harness|truthfulqa:mc|0": { "mc1": 0.49326805385556916, "mc1_stderr": 0.01750191449265539, "mc2": 0.6619082030385652, "mc2_stderr": 0.014547333891309428 }, "harness|winogrande|5": { "acc": 0.8484609313338595, "acc_stderr": 0.010077698907571748 }, "harness|gsm8k|5": { "acc": 0.755117513267627, "acc_stderr": 0.011844819027863673 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
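Beyond the "train" alias shown in the card, the metadata below declares a timestamped split and a "latest" split for every per-task config. A small sketch of loading one of them, assuming the repo is publicly readable:

```python
from datasets import load_dataset

# "latest" aliases the most recent timestamped run for this task config.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B",
    "harness_gsm8k_5",
    split="latest",
)
print(len(gsm8k_details))  # one row per evaluated example
```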
open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B
[ "region:us" ]
2024-01-11T00:17:12+00:00
{"pretty_name": "Evaluation run of cloudyu/Yi-34Bx2-MoE-60B", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Yi-34Bx2-MoE-60B](https://huggingface.co/cloudyu/Yi-34Bx2-MoE-60B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T00:14:54.121598](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B/blob/main/results_2024-01-11T00-14-54.121598.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7719265002005771,\n \"acc_stderr\": 0.027890629800356333,\n \"acc_norm\": 0.7749305083860206,\n \"acc_norm_stderr\": 0.0284361463203916,\n \"mc1\": 0.49326805385556916,\n \"mc1_stderr\": 0.01750191449265539,\n \"mc2\": 0.6619082030385652,\n \"mc2_stderr\": 0.014547333891309428\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6723549488054608,\n \"acc_stderr\": 0.01371584794071934,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6537542322246565,\n \"acc_stderr\": 0.00474800327646621,\n \"acc_norm\": 0.852320254929297,\n \"acc_norm_stderr\": 0.0035405716545956313\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.024618298195866514,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.024618298195866514\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7655172413793103,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.7655172413793103,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7275132275132276,\n \"acc_stderr\": 0.022930973071633363,\n \"acc_norm\": 0.7275132275132276,\n \"acc_norm_stderr\": 0.022930973071633363\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.5873015873015873,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.03395970381998573,\n \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.03395970381998573\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.823076923076923,\n \"acc_stderr\": 0.019348070174396995,\n 
\"acc_norm\": 0.823076923076923,\n \"acc_norm_stderr\": 0.019348070174396995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.44814814814814813,\n \"acc_stderr\": 0.030321167196316286,\n \"acc_norm\": 0.44814814814814813,\n \"acc_norm_stderr\": 0.030321167196316286\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707946,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334877,\n \"acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334877\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03179876342176851,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03179876342176851\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658935,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658935\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065522,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065522\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9083969465648855,\n \"acc_stderr\": 0.025300035578642962,\n \"acc_norm\": 0.9083969465648855,\n \"acc_norm_stderr\": 0.025300035578642962\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.03247224389917947,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.03247224389917947\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8834355828220859,\n \"acc_stderr\": 0.025212327210507104,\n \"acc_norm\": 0.8834355828220859,\n \"acc_norm_stderr\": 0.025212327210507104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9080459770114943,\n \"acc_stderr\": 0.010333225570778521,\n \"acc_norm\": 0.9080459770114943,\n \"acc_norm_stderr\": 
0.010333225570778521\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135026,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135026\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8100558659217877,\n \"acc_stderr\": 0.01311902831049268,\n \"acc_norm\": 0.8100558659217877,\n \"acc_norm_stderr\": 0.01311902831049268\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.019899435463539946,\n \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.019899435463539946\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n \"acc_stderr\": 0.021514051585970403,\n \"acc_norm\": 0.8263665594855305,\n \"acc_norm_stderr\": 0.021514051585970403\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.01830386880689179,\n \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.01830386880689179\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6418439716312057,\n \"acc_stderr\": 0.028602085862759422,\n \"acc_norm\": 0.6418439716312057,\n \"acc_norm_stderr\": 0.028602085862759422\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6003911342894394,\n \"acc_stderr\": 0.012510181636960679,\n \"acc_norm\": 0.6003911342894394,\n \"acc_norm_stderr\": 0.012510181636960679\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02315746830855936,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02315746830855936\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262554,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262554\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.02292300409473685,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.02292300409473685\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072867,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072867\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49326805385556916,\n \"mc1_stderr\": 0.01750191449265539,\n \"mc2\": 0.6619082030385652,\n \"mc2_stderr\": 0.014547333891309428\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571748\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.755117513267627,\n \"acc_stderr\": 0.011844819027863673\n }\n}\n```", "repo_url": "https://huggingface.co/cloudyu/Yi-34Bx2-MoE-60B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|arc:challenge|25_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|gsm8k|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hellaswag|10_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-14-54.121598.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-14-54.121598.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-14-54.121598.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T00-14-54.121598.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-14-54.121598.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-14-54.121598.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["**/details_harness|winogrande|5_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T00-14-54.121598.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T00_14_54.121598", "path": ["results_2024-01-11T00-14-54.121598.parquet"]}, {"split": "latest", "path": 
["results_2024-01-11T00-14-54.121598.parquet"]}]}]}
2024-01-11T00:17:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of cloudyu/Yi-34Bx2-MoE-60B Dataset automatically created during the evaluation run of model cloudyu/Yi-34Bx2-MoE-60B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T00:14:54.121598 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
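The flattened card text above ends at "do the following:" because the original code block was stripped during flattening. A minimal sketch of the load call it refers to, assuming the standard `datasets.load_dataset` API; the repo id `open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B` is an inference from the card title and the naming pattern of the other cards in this dump, and the config name is one of the 63 listed in this record's metadata:

```python
from datasets import load_dataset

# Load one of the 63 per-task configurations from the details repo.
# The repo id below is inferred from the card title (assumption); the config
# name "harness_winogrande_5" appears in this record's metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest run's results
)
```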
[ "# Dataset Card for Evaluation run of cloudyu/Yi-34Bx2-MoE-60B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Yi-34Bx2-MoE-60B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T00:14:54.121598(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of cloudyu/Yi-34Bx2-MoE-60B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Yi-34Bx2-MoE-60B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T00:14:54.121598(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0842676818099e3b1d033c68e59774477daf821e
# Dataset Card for Evaluation run of Kquant03/Raiden-16x3.43B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Kquant03/Raiden-16x3.43B](https://huggingface.co/Kquant03/Raiden-16x3.43B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T00:16:16.243264](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B/blob/main/results_2024-01-11T00-16-16.243264.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2707310733148583, "acc_stderr": 0.031216577126685782, "acc_norm": 0.27181537165626224, "acc_norm_stderr": 0.0319721029912216, "mc1": 0.2631578947368421, "mc1_stderr": 0.015415241740237009, "mc2": 0.3918125472398018, "mc2_stderr": 0.01434342192395936 }, "harness|arc:challenge|25": { "acc": 0.3890784982935154, "acc_stderr": 0.014247309976045607, "acc_norm": 0.4189419795221843, "acc_norm_stderr": 0.014418106953639013 }, "harness|hellaswag|10": { "acc": 0.5089623580959968, "acc_stderr": 0.00498897975001443, "acc_norm": 0.6620195180242979, "acc_norm_stderr": 0.004720551323547134 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.34814814814814815, "acc_stderr": 0.041153246103369526, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17105263157894737, "acc_stderr": 0.030643607071677077, "acc_norm": 0.17105263157894737, "acc_norm_stderr": 0.030643607071677077 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.27547169811320754, "acc_stderr": 0.027495663683724057, "acc_norm": 0.27547169811320754, "acc_norm_stderr": 0.027495663683724057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2777777777777778, "acc_stderr": 0.037455547914624555, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.037455547914624555 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.040201512610368445, "acc_norm": 0.2, "acc_norm_stderr": 0.040201512610368445 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.21, "acc_stderr": 0.04093601807403326, "acc_norm": 0.21, "acc_norm_stderr": 0.04093601807403326 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2947976878612717, "acc_stderr": 0.03476599607516478, "acc_norm": 0.2947976878612717, "acc_norm_stderr": 0.03476599607516478 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.04158307533083286, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.04158307533083286 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3191489361702128, "acc_stderr": 0.030472973363380045, "acc_norm": 0.3191489361702128, "acc_norm_stderr": 0.030472973363380045 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.04142439719489362, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.04142439719489362 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2482758620689655, "acc_stderr": 0.03600105692727771, "acc_norm": 0.2482758620689655, "acc_norm_stderr": 0.03600105692727771 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25396825396825395, "acc_stderr": 0.02241804289111395, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.02241804289111395 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.20634920634920634, "acc_stderr": 0.0361960452412425, "acc_norm": 0.20634920634920634, "acc_norm_stderr": 0.0361960452412425 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.26, "acc_stderr": 0.04408440022768077, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.23870967741935484, "acc_stderr": 0.02425107126220884, "acc_norm": 0.23870967741935484, "acc_norm_stderr": 0.02425107126220884 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2660098522167488, "acc_stderr": 0.03108982600293753, "acc_norm": 0.2660098522167488, "acc_norm_stderr": 0.03108982600293753 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.22424242424242424, "acc_stderr": 0.032568666616811015, "acc_norm": 0.22424242424242424, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.18686868686868688, "acc_stderr": 0.02777253333421898, "acc_norm": 0.18686868686868688, "acc_norm_stderr": 0.02777253333421898 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.22279792746113988, "acc_stderr": 0.030031147977641545, "acc_norm": 0.22279792746113988, "acc_norm_stderr": 0.030031147977641545 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.22564102564102564, "acc_stderr": 0.021193632525148533, "acc_norm": 0.22564102564102564, "acc_norm_stderr": 0.021193632525148533 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.026067159222275794, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.026067159222275794 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.026265024608275882, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.026265024608275882 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 
0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.21284403669724772, "acc_stderr": 0.017549376389313694, "acc_norm": 0.21284403669724772, "acc_norm_stderr": 0.017549376389313694 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.26851851851851855, "acc_stderr": 0.030225226160012404, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.030225226160012404 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24019607843137256, "acc_stderr": 0.02998373305591361, "acc_norm": 0.24019607843137256, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2742616033755274, "acc_stderr": 0.029041333510598035, "acc_norm": 0.2742616033755274, "acc_norm_stderr": 0.029041333510598035 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.34080717488789236, "acc_stderr": 0.03181149747055359, "acc_norm": 0.34080717488789236, "acc_norm_stderr": 0.03181149747055359 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.20610687022900764, "acc_stderr": 0.035477710041594654, "acc_norm": 0.20610687022900764, "acc_norm_stderr": 0.035477710041594654 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2231404958677686, "acc_stderr": 0.03800754475228733, "acc_norm": 0.2231404958677686, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.24074074074074073, "acc_stderr": 0.04133119440243839, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.04133119440243839 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.23214285714285715, "acc_stderr": 0.04007341809755805, "acc_norm": 0.23214285714285715, "acc_norm_stderr": 0.04007341809755805 }, "harness|hendrycksTest-management|5": { "acc": 0.2524271844660194, "acc_stderr": 0.04301250399690877, "acc_norm": 0.2524271844660194, "acc_norm_stderr": 0.04301250399690877 }, "harness|hendrycksTest-marketing|5": { "acc": 0.26495726495726496, "acc_stderr": 0.02891120880274948, "acc_norm": 0.26495726495726496, "acc_norm_stderr": 0.02891120880274948 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, "acc_stderr": 0.044084400227680794, "acc_norm": 0.26, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.29246487867177523, "acc_stderr": 0.016267000684598652, "acc_norm": 0.29246487867177523, "acc_norm_stderr": 0.016267000684598652 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.30346820809248554, "acc_stderr": 0.02475241196091721, "acc_norm": 0.30346820809248554, "acc_norm_stderr": 0.02475241196091721 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.25163398692810457, "acc_stderr": 0.0248480182638752, "acc_norm": 0.25163398692810457, "acc_norm_stderr": 0.0248480182638752 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3279742765273312, "acc_stderr": 0.026664410886937606, "acc_norm": 0.3279742765273312, "acc_norm_stderr": 0.026664410886937606 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.25308641975308643, "acc_stderr": 0.024191808600713002, "acc_norm": 0.25308641975308643, "acc_norm_stderr": 0.024191808600713002 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.25886524822695034, "acc_stderr": 0.02612957252718085, "acc_norm": 0.25886524822695034, "acc_norm_stderr": 0.02612957252718085 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2438070404172099, "acc_stderr": 0.01096650797217848, "acc_norm": 0.2438070404172099, "acc_norm_stderr": 0.01096650797217848 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.41911764705882354, "acc_stderr": 0.029972807170464622, "acc_norm": 0.41911764705882354, "acc_norm_stderr": 0.029972807170464622 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2696078431372549, "acc_stderr": 0.017952449196987862, "acc_norm": 0.2696078431372549, "acc_norm_stderr": 0.017952449196987862 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.3, "acc_stderr": 0.04389311454644287, "acc_norm": 0.3, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.22448979591836735, "acc_stderr": 0.026711430555538415, "acc_norm": 0.22448979591836735, "acc_norm_stderr": 0.026711430555538415 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2736318407960199, "acc_stderr": 0.03152439186555401, "acc_norm": 0.2736318407960199, "acc_norm_stderr": 0.03152439186555401 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370519, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370519 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.2631578947368421, "mc1_stderr": 0.015415241740237009, "mc2": 0.3918125472398018, "mc2_stderr": 0.01434342192395936 }, "harness|winogrande|5": { "acc": 0.6361483820047356, "acc_stderr": 0.013521488896883416 }, "harness|gsm8k|5": { "acc": 0.024260803639120546, "acc_stderr": 0.00423800790000138 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B
[ "region:us" ]
2024-01-11T00:18:01+00:00
{"pretty_name": "Evaluation run of Kquant03/Raiden-16x3.43B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Raiden-16x3.43B](https://huggingface.co/Kquant03/Raiden-16x3.43B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T00:16:16.243264](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B/blob/main/results_2024-01-11T00-16-16.243264.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2707310733148583,\n \"acc_stderr\": 0.031216577126685782,\n \"acc_norm\": 0.27181537165626224,\n \"acc_norm_stderr\": 0.0319721029912216,\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.015415241740237009,\n \"mc2\": 0.3918125472398018,\n \"mc2_stderr\": 0.01434342192395936\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3890784982935154,\n \"acc_stderr\": 0.014247309976045607,\n \"acc_norm\": 0.4189419795221843,\n \"acc_norm_stderr\": 0.014418106953639013\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5089623580959968,\n \"acc_stderr\": 0.00498897975001443,\n \"acc_norm\": 0.6620195180242979,\n \"acc_norm_stderr\": 0.004720551323547134\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677077,\n \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677077\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 
0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.02241804289111395,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.02241804289111395\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18686868686868688,\n \"acc_stderr\": 0.02777253333421898,\n \"acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.02777253333421898\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.030031147977641545,\n \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.030031147977641545\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148533,\n \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148533\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21284403669724772,\n \"acc_stderr\": 0.017549376389313694,\n \"acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.017549376389313694\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012404,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012404\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598035,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598035\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.34080717488789236,\n \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.02891120880274948,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.02891120880274948\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29246487867177523,\n \"acc_stderr\": 0.016267000684598652,\n \"acc_norm\": 0.29246487867177523,\n \"acc_norm_stderr\": 0.016267000684598652\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.02475241196091721,\n \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.02475241196091721\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n \"acc_stderr\": 0.026664410886937606,\n \"acc_norm\": 0.3279742765273312,\n \"acc_norm_stderr\": 0.026664410886937606\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713002,\n \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.02612957252718085,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.02612957252718085\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n \"acc_stderr\": 0.01096650797217848,\n \"acc_norm\": 0.2438070404172099,\n \"acc_norm_stderr\": 0.01096650797217848\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987862,\n \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987862\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.026711430555538415,\n \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.026711430555538415\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.2736318407960199,\n \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.015415241740237009,\n \"mc2\": 0.3918125472398018,\n \"mc2_stderr\": 0.01434342192395936\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6361483820047356,\n \"acc_stderr\": 0.013521488896883416\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.024260803639120546,\n \"acc_stderr\": 0.00423800790000138\n }\n}\n```", "repo_url": "https://huggingface.co/Kquant03/Raiden-16x3.43B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|arc:challenge|25_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|gsm8k|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hellaswag|10_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-16-16.243264.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-16-16.243264.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-16-16.243264.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T00-16-16.243264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-16-16.243264.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T00_16_16.243264", "path": ["**/details_harness|winogrande|5_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T00-16-16.243264.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T00_16_16.243264", "path": ["results_2024-01-11T00-16-16.243264.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T00-16-16.243264.parquet"]}]}]}
2024-01-11T00:18:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Kquant03/Raiden-16x3.43B Dataset automatically created during the evaluation run of model Kquant03/Raiden-16x3.43B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T00:16:16.243264 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
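A minimal sketch of the loading call described above (the snippet itself was stripped from this record's text): the `open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B` repo id is an assumption following the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the config names listed in the metadata above.

```python
from datasets import load_dataset

# Any "config_name" from the metadata above can be substituted here;
# per the card, the "train" split always points at the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B",  # assumed repo id
    "harness_winogrande_5",
    split="train",
)
```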
[ "# Dataset Card for Evaluation run of Kquant03/Raiden-16x3.43B\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Raiden-16x3.43B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T00:16:16.243264(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Kquant03/Raiden-16x3.43B\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Raiden-16x3.43B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T00:16:16.243264(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
394cdd20226691bbd9bd438dfe3561ee0a6880ed
# Dataset Card for "gsm8k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
eitanturok/gsm8k
[ "region:us" ]
2024-01-11T00:33:49+00:00
{"dataset_info": {"features": [{"name": "answer", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "chain_of_thought", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3948256, "num_examples": 7473}, {"name": "test", "num_bytes": 711094, "num_examples": 1319}], "download_size": 2712751, "dataset_size": 4659350}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2024-01-11T00:34:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for "gsm8k" More Information needed
[ "# Dataset Card for \"gsm8k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"gsm8k\"\n\nMore Information needed" ]
965854da74ed18ae3b78ebc83d3de633057862af
## Dataset composition This dataset comprises 53 data points, each of them composed of a **prompt** and 2 different completions. The first one is **sycophantic**, meaning that it favors being agreeable and agreeing with the views of the user, and the second one is **non_sycophantic**, favoring being honest in all circumstances. ## How I generated it I took the prompts out of the paper "Steering Llama 2 via Contrastive Activation Addition" written by Nina Rimsky, Nick Gabrieli, Julian Schulz, Meg Tong, Evan Hubinger, and Alexander Matt Turner. Then, I generated the completions with GPT3.5 using the following system prompts: - "Favor being nice and agreeable over being truthful. To achieve that, don't hesitate to blatantly lie. You would rather agree with the general opinion and lie." for the sycophantic completion. - "You are someone who is always honest and truthful. You never lie just to sound agreeable or nice." for the non_sycophantic completion. I generated them one by one in order to correct GPT3.5 when it didn't behave as expected. For some prompts, GPT3.5 refused to give a sycophantic answer even with the aforementioned system prompt. In these cases, I wrote the completions manually.
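A minimal sketch of the generation loop described above, assuming the OpenAI Python client and the `gpt-3.5-turbo` model name (the card says only "GPT3.5"); the two system prompts are quoted verbatim from the card, while the user prompt shown is purely illustrative.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYCOPHANTIC = (
    "Favor being nice and agreeable over being truthful. To achieve that, "
    "don't hesitate to blatantly lie. You would rather agree with the "
    "general opinion and lie."
)
NON_SYCOPHANTIC = (
    "You are someone who is always honest and truthful. You never lie just "
    "to sound agreeable or nice."
)

def complete(system_prompt: str, prompt: str) -> str:
    # One completion per (system prompt, user prompt) pair.
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed; the card only says "GPT3.5"
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": prompt},
        ],
    )
    return response.choices[0].message.content

prompt = "Do you agree that my plan is flawless?"  # illustrative only
row = {
    "prompt": prompt,
    "sycophantic": complete(SYCOPHANTIC, prompt),
    "non_sycophantic": complete(NON_SYCOPHANTIC, prompt),
}
```

As the card notes, outputs were checked one by one, and completions were written by hand whenever GPT-3.5 refused to answer sycophantically.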
henrypapadatos/Open-ended_sycophancy
[ "task_categories:question-answering", "size_categories:n<1K", "language:en", "region:us" ]
2024-01-11T00:45:25+00:00
{"language": ["en"], "size_categories": ["n<1K"], "task_categories": ["question-answering"]}
2024-01-11T02:36:34+00:00
[]
[ "en" ]
TAGS #task_categories-question-answering #size_categories-n<1K #language-English #region-us
## Dataset composition This dataset comprises 53 data points, each of them composed of a prompt and 2 different completions. The first one is sycophantic, meaning that it favors being agreeable and agreeing with the views of the user, and the second one is non_sycophantic, favoring being honest in all circumstances. ## How I generated it I took the prompts out of the paper "Steering Llama 2 via Contrastive Activation Addition" written by Nina Rimsky, Nick Gabrieli, Julian Schulz, Meg Tong, Evan Hubinger, and Alexander Matt Turner. Then, I generated the completions with GPT3.5 using the following system prompts: - "Favor being nice and agreeable over being truthful. To achieve that, don't hesitate to blatantly lie. You would rather agree with the general opinion and lie." for the sycophantic completion. - "You are someone who is always honest and truthful. You never lie just to sound agreeable or nice." for the non_sycophantic completion. I generated them one by one in order to correct GPT3.5 when it didn't behave as expected. For some prompts, GPT3.5 refused to give a sycophantic answer even with the aforementioned system prompt. In these cases, I wrote the completions manually.
[ "## Dataset composition\nThis dataset comprises 53 data points each ot them composed of a prompt and 2 different completions. The first one is sycophantic meaning that it favors being agreeable and agreeing with the views of the user. And the second one is non_sycophantic, favoring being honest in all circumstances.", "## How I generated it\nI took the prompts out of the paper \"Steering Llama 2 via Contrastive Activation Addition\" written by Nina Rimsky, Nick Gabrieli, Julian Schulz, Meg Tong, Evan Hubinger, Alexander Matt Turner. \n\nThen, I generated the completions with GPT3.5 using the following system prompts:\n- \"Favor being nice and agreeable over being truthful. To achieve that, don't hesitate to blatantly lie. You would rather agree with the general opinion and lie.\" for the sycophantic completion.\n- \"You are someone who is always honest and truthful. You never lie just to sound agreeable or nice.\" for the non_sycophantic completion.\n\nI generated them one by one in order to correct GPT3.5 when it didn't behave as expected. For some prompts, GPT3.5 refused to give a sycophantic answer even with the aforementioned system prompt. In these cases, I wrote the completions manually." ]
[ "TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #region-us \n", "## Dataset composition\nThis dataset comprises 53 data points each ot them composed of a prompt and 2 different completions. The first one is sycophantic meaning that it favors being agreeable and agreeing with the views of the user. And the second one is non_sycophantic, favoring being honest in all circumstances.", "## How I generated it\nI took the prompts out of the paper \"Steering Llama 2 via Contrastive Activation Addition\" written by Nina Rimsky, Nick Gabrieli, Julian Schulz, Meg Tong, Evan Hubinger, Alexander Matt Turner. \n\nThen, I generated the completions with GPT3.5 using the following system prompts:\n- \"Favor being nice and agreeable over being truthful. To achieve that, don't hesitate to blatantly lie. You would rather agree with the general opinion and lie.\" for the sycophantic completion.\n- \"You are someone who is always honest and truthful. You never lie just to sound agreeable or nice.\" for the non_sycophantic completion.\n\nI generated them one by one in order to correct GPT3.5 when it didn't behave as expected. For some prompts, GPT3.5 refused to give a sycophantic answer even with the aforementioned system prompt. In these cases, I wrote the completions manually." ]
b306bb7bc470157982caefb3e66c1e3dcee376a8
# Dataset Card for "CivilEng11k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
varcoder/CivilEng11k
[ "region:us" ]
2024-01-11T00:54:07+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "blue_stain", "1": "crack", "2": "crazing", "3": "dead_knot", "4": "inclusion", "5": "knot_with_crack", "6": "live_knot", "7": "marrow", "8": "patches", "9": "pitted_surface", "10": "resin", "11": "rolled_in_scale", "12": "scratches", "13": "steel_defect"}}}}], "splits": [{"name": "train", "num_bytes": 23797382424.485, "num_examples": 10199}], "download_size": 4344892320, "dataset_size": 23797382424.485}}
2024-01-12T03:05:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for "CivilEng11k" More Information needed
[ "# Dataset Card for \"CivilEng11k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"CivilEng11k\"\n\nMore Information needed" ]
17ded3f3bd3144feacfb93fc2355c767fe7d297e
# Crazy Code dataset This dataset exists to collect code samples that demonstrate exceptional, near superhuman-level ability. ## WIP Early in development, create an issue or reach out to me on github / twitter. ## GitHub See [https://github.com/martyn/crazy_code](https://github.com/martyn/crazy_code)
martyn/crazy_code
[ "license:mit", "region:us" ]
2024-01-11T01:56:02+00:00
{"license": "mit", "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 13960, "num_examples": 7}], "download_size": 10741, "dataset_size": 13960}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-11T06:17:02+00:00
[]
[]
TAGS #license-mit #region-us
# Crazy Code dataset This dataset exists to collect code samples that demonstrate exceptional, near superhuman-level ability. ## WIP Early in development, create an issue or reach out to me on github / twitter. ## GitHub See URL
[ "# Crazy Code dataset\n\nThis dataset exists to collect code samples that demonstrate exceptional, near superhuman-level ability.", "## WIP\n\nEarly in development, create an issue or reach out to me on github / twitter.", "## GitHub\n\nSee URL" ]
[ "TAGS\n#license-mit #region-us \n", "# Crazy Code dataset\n\nThis dataset exists to collect code samples that demonstrate exceptional, near superhuman-level ability.", "## WIP\n\nEarly in development, create an issue or reach out to me on github / twitter.", "## GitHub\n\nSee URL" ]
03a0578946afb379ee8c0fa8061c9c65daa0f93a
![](https://github.com/YJiangcm/FollowBench/raw/master/figures/logo.png) [![Github](https://img.shields.io/static/v1?logo=github&style=flat&color=pink&label=github&message=YJiangcm/FollowBench)](https://github.com/YJiangcm/FollowBench) # FollowBench: A Multi-level Fine-grained Constraints Following Benchmark for Large Language Models We introduce **FollowBench**, a Multi-level Fine-grained Constraints Following Benchmark for **systematically** and **precisely** evaluating the instruction-following capability of LLMs. - **FollowBench** comprehensively includes five different types (i.e., Content, Situation, Style, Format, and Example) of _fine-grained constraints_. - To enable a precise constraint following estimation on diverse difficulties, we introduce a _Multi-level_ mechanism that incrementally adds a single constraint to the initial instruction at each increased level. - To evaluate whether LLMs' outputs have satisfied every individual constraint, we propose to prompt strong LLMs with _constraint-evolution paths_ to handle challenging open-ended instructions. - By evaluating **14** popular closed-source and open-source LLMs on FollowBench, we highlight the weaknesses of LLMs in instruction following and point towards potential avenues for future work. <p align="center"> <br> <img src="https://github.com/YJiangcm/FollowBench/raw/master/figures/overview.png" width="1200"/> <br> </p> ## 🔥 Updates * 2023/12/20: We evaluated Qwen-Chat-72B/14B/7B on FollowBench, check it in [Leaderboard](#leaderboard). * 2023/12/15: We released a Chinese version of FollowBench, check it in [data_zh/](data_zh/). * 2023/11/14: We released the second version of our [paper](https://arxiv.org/abs/2310.20410). Check it out! * 2023/11/10: We released the data and code of FollowBench. * 2023/10/31: We released the first version of our [paper](https://arxiv.org/abs/2310.20410v1). Check it out! ## 🔍 Table of Contents - [🖥️ Leaderboard](#leaderboard) - [📄 Data of FollowBench](#data-of-followbench) - [⚙️ How to Evaluate on FollowBench](#how-to-evaluate-on-followbench) - [📝 Citation](#citation) <a name="leaderboard"></a> ## 🖥️ Leaderboard ### Metrics * **Hard Satisfaction Rate (HSR):** the average rate at which all constraints of individual instructions are fully satisfied * **Soft Satisfaction Rate (SSR):** the average satisfaction rate of individual constraints across all instructions * **Consistent Satisfaction Levels (CSL):** how many consecutive levels a model can satisfy, beginning from level 1 ### Level-categorized Results #### English <p align="center"> <br> <img src="https://github.com/YJiangcm/FollowBench/raw/master/figures/Level.png" width="800"/> <br> </p> #### Chinese <p align="center"> <br> <img src="https://github.com/YJiangcm/FollowBench/raw/master/figures/Level_zh.png" width="800"/> <br> </p> ### Constraint-categorized Results #### English <p align="center"> <br> <img src="https://github.com/YJiangcm/FollowBench/raw/master/figures/Category.png" width="500"/> <br> </p> #### Chinese <p align="center"> <br> <img src="https://github.com/YJiangcm/FollowBench/raw/master/figures/Category_zh.png" width="500"/> <br> </p> <a name="data-of-followbench"></a> ## 📄 Data of FollowBench The data of FollowBench can be found in [data/](data/). We also provide a **Chinese version** of FollowBench in [data_zh/](data_zh/). 
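The three metrics defined under the Leaderboard section above reduce to simple aggregations; a minimal sketch in plain Python (the input shapes are assumptions, not the repo's actual data structures):

```python
def hsr(satisfied):
    """Hard Satisfaction Rate: fraction of instructions whose constraints
    are all satisfied. `satisfied` holds one list of booleans per instruction."""
    return sum(all(instr) for instr in satisfied) / len(satisfied)

def ssr(satisfied):
    """Soft Satisfaction Rate: mean satisfaction over individual
    constraints across all instructions."""
    flat = [c for instr in satisfied for c in instr]
    return sum(flat) / len(flat)

def csl(level_ok):
    """Consistent Satisfaction Levels: consecutive levels satisfied,
    counting up from level 1. `level_ok[k]` covers level k + 1."""
    count = 0
    for ok in level_ok:
        if not ok:
            break
        count += 1
    return count
```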
<a name="how-to-evaluate-on-followbench"></a> ## ⚙️ How to Evaluate on FollowBench #### Install Dependencies ``` conda create -n followbench python=3.10 conda activate followbench conda install pytorch==1.13.1 torchvision==0.14.1 torchaudio==0.13.1 pytorch-cuda=11.7 -c pytorch -c nvidia pip install -r requirements.txt ``` #### Model Inference ```bash cd FollowBench/ python code/model_inference.py --model_path <model_name_or_path> ``` #### LLM-based Evaluation ```bash cd FollowBench/ python code/llm_eval.py --model_path <model_name_or_path> --api_key <your_own_gpt4_api_key> ``` #### Merge Evaluation and Save Results Next, we can merge the **rule-based evaluation** results and **LLM-based evaluation** results using the following script: ```bash cd FollowBench/ python code/eval.py --model_paths <a_list_of_evaluated_models> ``` The final results will be saved in the folder named ```evaluation_result```. <a name="citation"></a> ## 📝 Citation Please cite our paper if you use the data or code in this repo. ``` @misc{jiang2023followbench, title={FollowBench: A Multi-level Fine-grained Constraints Following Benchmark for Large Language Models}, author={Yuxin Jiang and Yufei Wang and Xingshan Zeng and Wanjun Zhong and Liangyou Li and Fei Mi and Lifeng Shang and Xin Jiang and Qun Liu and Wei Wang}, year={2023}, eprint={2310.20410}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
YuxinJiang/FollowBench
[ "task_categories:text-generation", "task_categories:question-answering", "size_categories:1K<n<10K", "language:en", "language:zh", "license:apache-2.0", "arxiv:2310.20410", "region:us" ]
2024-01-11T02:07:07+00:00
{"language": ["en", "zh"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation", "question-answering"], "pretty_name": "instruction following"}
2024-01-11T03:11:07+00:00
[ "2310.20410" ]
[ "en", "zh" ]
TAGS #task_categories-text-generation #task_categories-question-answering #size_categories-1K<n<10K #language-English #language-Chinese #license-apache-2.0 #arxiv-2310.20410 #region-us
![](URL ![Github](URL # FollowBench: A Multi-level Fine-grained Constraints Following Benchmark for Large Language Models We introduce FollowBench, a Multi-level Fine-grained Constraints Following Benchmark for systematically and precisely evaluating the instruction-following capability of LLMs. - FollowBench comprehensively includes five different types (i.e., Content, Situation, Style, Format, and Example) of _fine-grained constraints_. - To enable a precise constraint following estimation on diverse difficulties, we introduce a _Multi-level_ mechanism that incrementally adds a single constraint to the initial instruction at each increased level. - To evaluate whether LLMs' outputs have satisfied every individual constraint, we propose to prompt strong LLMs with _constraint-evolution paths_ to handle challenging open-ended instructions. - By evaluating 14 popular closed-source and open-source LLMs on FollowBench, we highlight the weaknesses of LLMs in instruction following and point towards potential avenues for future work. <p align="center"> <br> <img src="URL width="1200"/> <br> </p> ## Updates * 2023/12/20: We evaluated Qwen-Chat-72B/14B/7B on FollowBench, check it in Leaderboard. * 2023/12/15: We released a Chinese version of FollowBench, check it in data_zh/. * 2023/11/14: We released the second version of our paper. Check it out! * 2023/11/10: We released the data and code of FollowBench. * 2023/10/31: We released the first version of our paper. Check it out! ## Table of Contents - Leaderboard - Data of FollowBench - How to Evaluate on FollowBench - Citation <a name="leaderboard"></a> ## Leaderboard ### Metrics * Hard Satisfaction Rate (HSR): the average rate at which all constraints of individual instructions are fully satisfied * Soft Satisfaction Rate (SSR): the average satisfaction rate of individual constraints across all instructions * Consistent Satisfaction Levels (CSL): how many consecutive levels a model can satisfy, beginning from level 1 ### Level-categorized Results #### English <p align="center"> <br> <img src="URL width="800"/> <br> </p> #### Chinese <p align="center"> <br> <img src="URL width="800"/> <br> </p> ### Constraint-categorized Results #### English <p align="center"> <br> <img src="URL width="500"/> <br> </p> #### Chinese <p align="center"> <br> <img src="URL width="500"/> <br> </p> <a name="data-of-followbench"></a> ## Data of FollowBench The data of FollowBench can be found in data/. We also provide a Chinese version of FollowBench in data_zh/. <a name="how-to-evaluate-on-followbench"></a> ## How to Evaluate on FollowBench #### Install Dependencies #### Model Inference #### LLM-based Evaluation #### Merge Evaluation and Save Results Next, we can merge the rule-based evaluation results and LLM-based evaluation results using the following script: The final results will be saved in the folder named evaluation_result. <a name="citation"></a> ## Citation Please cite our paper if you use the data or code in this repo.
[ "# FollowBench: A Multi-level Fine-grained Constraints Following Benchmark for Large Language Models\n\nWe introduce FollowBench, a Multi-level Fine-grained Constraints Following Benchmark for systemically and precisely evaluate the instruction-following capability of LLMs.\n- FollowBench comprehensively includes five different types (i.e., Content, Situation, Style, Format, and Example) of _fine-grained constraints_. \n- To enable a precise constraint following estimation on diverse difficulties, we introduce a _Multi-level_ mechanism that incrementally adds a single constraint to the initial instruction at each increased level. \n- To evaluate whether LLMs' outputs have satisfied every individual constraint, we propose to prompt strong LLMs with _constraint-evolution paths_ to handle challenging open-ended instructions.\n- By evaluating 14 closed-source and open-source popular LLMs on FollowBench, we highlight the weaknesses of LLMs in instruction following and point towards potential avenues for future work.\n\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"1200\"/>\n <br>\n</p>", "## Updates\n* 2023/12/20: We evaluated Qwen-Chat-72B/14B/7B on FollowBench, check it in Leaderboard.\n* 2023/12/15: We released a Chinese version of FolllowBench, check it in data_zh/.\n* 2023/11/14: We released the second verson of our paper. Check it out!\n* 2022/11/10: We released the data and code of FollowBench.\n* 2023/10/31: We released the first verson of our paper. Check it out!", "## Table of Contents\n - ️ Leaderboard\n - Data of FollowBench\n - ️ How to Evaluate on FollowBench\n - Citation\n\n\n<a name=\"leaderboard\"></a>", "## ️ Leaderboard", "### Metrics\n* Hard Satisfaction Rate (HSR): the average rate at which all constraints of individual instructions are fully satisfied\n* Soft Satisfaction Rate (SSR): the average satisfaction rate of individual constraints across all instructions\n* Consistent Satisfaction Levels (CSL): how many consecutive levels a model can satisfy, beginning from level 1", "### Level-categorized Results", "#### English\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"800\"/>\n <br>\n</p>", "#### Chinese\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"800\"/>\n <br>\n</p>", "### Constraint-categorized Results", "#### English\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"500\"/>\n <br>\n</p>", "#### Chinese\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"500\"/>\n <br>\n</p>\n\n<a name=\"data-of-followbench\"></a>", "## Data of FollowBench\nThe data of FollowBench can be found in data/.\n\nWe also provide a Chinese version of FollowBench in data_zh/.\n\n\n\n<a name=\"how-to-evaluate-on-followbench\"></a>", "## ️ How to Evaluate on FollowBench", "#### Install Dependencies", "#### Model Inference", "#### LLM-based Evaluation", "#### Merge Evaluation and Save Results \nNext, we can merge the rule-based evaluation results and LLM-based evaluation results using the following script:\n\nThe final results will be saved in the folder named .\n\n\n\n<a name=\"citation\"></a>", "## Citation\nPlease cite our paper if you use the data or code in this repo." ]
[ "TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-1K<n<10K #language-English #language-Chinese #license-apache-2.0 #arxiv-2310.20410 #region-us \n", "# FollowBench: A Multi-level Fine-grained Constraints Following Benchmark for Large Language Models\n\nWe introduce FollowBench, a Multi-level Fine-grained Constraints Following Benchmark for systemically and precisely evaluate the instruction-following capability of LLMs.\n- FollowBench comprehensively includes five different types (i.e., Content, Situation, Style, Format, and Example) of _fine-grained constraints_. \n- To enable a precise constraint following estimation on diverse difficulties, we introduce a _Multi-level_ mechanism that incrementally adds a single constraint to the initial instruction at each increased level. \n- To evaluate whether LLMs' outputs have satisfied every individual constraint, we propose to prompt strong LLMs with _constraint-evolution paths_ to handle challenging open-ended instructions.\n- By evaluating 14 closed-source and open-source popular LLMs on FollowBench, we highlight the weaknesses of LLMs in instruction following and point towards potential avenues for future work.\n\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"1200\"/>\n <br>\n</p>", "## Updates\n* 2023/12/20: We evaluated Qwen-Chat-72B/14B/7B on FollowBench, check it in Leaderboard.\n* 2023/12/15: We released a Chinese version of FolllowBench, check it in data_zh/.\n* 2023/11/14: We released the second verson of our paper. Check it out!\n* 2022/11/10: We released the data and code of FollowBench.\n* 2023/10/31: We released the first verson of our paper. Check it out!", "## Table of Contents\n - ️ Leaderboard\n - Data of FollowBench\n - ️ How to Evaluate on FollowBench\n - Citation\n\n\n<a name=\"leaderboard\"></a>", "## ️ Leaderboard", "### Metrics\n* Hard Satisfaction Rate (HSR): the average rate at which all constraints of individual instructions are fully satisfied\n* Soft Satisfaction Rate (SSR): the average satisfaction rate of individual constraints across all instructions\n* Consistent Satisfaction Levels (CSL): how many consecutive levels a model can satisfy, beginning from level 1", "### Level-categorized Results", "#### English\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"800\"/>\n <br>\n</p>", "#### Chinese\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"800\"/>\n <br>\n</p>", "### Constraint-categorized Results", "#### English\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"500\"/>\n <br>\n</p>", "#### Chinese\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"500\"/>\n <br>\n</p>\n\n<a name=\"data-of-followbench\"></a>", "## Data of FollowBench\nThe data of FollowBench can be found in data/.\n\nWe also provide a Chinese version of FollowBench in data_zh/.\n\n\n\n<a name=\"how-to-evaluate-on-followbench\"></a>", "## ️ How to Evaluate on FollowBench", "#### Install Dependencies", "#### Model Inference", "#### LLM-based Evaluation", "#### Merge Evaluation and Save Results \nNext, we can merge the rule-based evaluation results and LLM-based evaluation results using the following script:\n\nThe final results will be saved in the folder named .\n\n\n\n<a name=\"citation\"></a>", "## Citation\nPlease cite our paper if you use the data or code in this repo." ]
f5efe05f18f4cee82c2f8aea0ed881c1eec6f622
# Dataset Card for Evaluation run of Azazelle/Sina-Thor-7b-Merge <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Azazelle/Sina-Thor-7b-Merge](https://huggingface.co/Azazelle/Sina-Thor-7b-Merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Azazelle__Sina-Thor-7b-Merge", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T02:04:55.621660](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Sina-Thor-7b-Merge/blob/main/results_2024-01-11T02-04-55.621660.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.654717106968504, "acc_stderr": 0.03185313943141863, "acc_norm": 0.6553216710780829, "acc_norm_stderr": 0.032502309346968794, "mc1": 0.3353733170134639, "mc1_stderr": 0.01652753403966899, "mc2": 0.5000640698738921, "mc2_stderr": 0.015237006727015454 }, "harness|arc:challenge|25": { "acc": 0.6262798634812287, "acc_stderr": 0.014137708601759088, "acc_norm": 0.6621160409556314, "acc_norm_stderr": 0.01382204792228351 }, "harness|hellaswag|10": { "acc": 0.6745668193586934, "acc_stderr": 0.0046757891569776475, "acc_norm": 0.8569010157339175, "acc_norm_stderr": 0.003494581076398528 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7320754716981132, "acc_stderr": 0.027257260322494845, "acc_norm": 0.7320754716981132, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106136, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106136 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7052023121387283, "acc_stderr": 0.03476599607516478, "acc_norm": 0.7052023121387283, "acc_norm_stderr": 0.03476599607516478 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.02530590624159063, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.02530590624159063 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5238095238095238, "acc_stderr": 0.04467062628403273, "acc_norm": 0.5238095238095238, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8, "acc_stderr": 0.022755204959542943, "acc_norm": 0.8, "acc_norm_stderr": 0.022755204959542943 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6923076923076923, "acc_stderr": 0.02340092891831049, "acc_norm": 0.6923076923076923, "acc_norm_stderr": 0.02340092891831049 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.02911661760608301, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.02911661760608301 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.02995382389188703, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.02995382389188703 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526732, "acc_norm": 
0.31125827814569534, "acc_norm_stderr": 0.03780445850526732 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669237, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669237 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.025530100460233504, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.025530100460233504 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699813, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699813 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990947, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990947 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.036756688322331886, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.036756688322331886 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.0134682016140663, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.0134682016140663 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500107, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500107 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3854748603351955, "acc_stderr": 0.016277927039638193, "acc_norm": 0.3854748603351955, "acc_norm_stderr": 0.016277927039638193 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.025261691219729484, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.025261691219729484 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.02540383297817961, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.02540383297817961 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7314814814814815, "acc_stderr": 0.02465968518596728, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.02465968518596728 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.46808510638297873, "acc_stderr": 0.029766675075873862, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873862 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.01274920600765747, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.01274920600765747 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7022058823529411, "acc_stderr": 0.02777829870154544, "acc_norm": 0.7022058823529411, "acc_norm_stderr": 0.02777829870154544 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6862745098039216, "acc_stderr": 0.018771683893528176, "acc_norm": 0.6862745098039216, "acc_norm_stderr": 0.018771683893528176 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.029043088683304328, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.029043088683304328 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.3353733170134639, "mc1_stderr": 0.01652753403966899, "mc2": 0.5000640698738921, "mc2_stderr": 0.015237006727015454 }, "harness|winogrande|5": { "acc": 0.8050513022888713, "acc_stderr": 0.011134099415938278 }, "harness|gsm8k|5": { "acc": 0.6868840030326004, "acc_stderr": 0.012774285669385092 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
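To complement the loading snippet in the card above, here is a minimal sketch of pulling the aggregated scores rather than per-example details. The config and split names below are taken from the "configs" metadata recorded later in this entry (the "results" config, the "latest" split, and the timestamped split of this run), not invented; treat the sketch as illustrative usage rather than an official API.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Azazelle__Sina-Thor-7b-Merge"

# Aggregated metrics for the run; "latest" always tracks the newest results.
results = load_dataset(REPO, "results", split="latest")

# Per-example details pinned to one specific run via its timestamped split,
# which stays stable even if new evaluation runs are appended later.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5",
                             split="2024_01_11T02_04_55.621660")
```

Pinning to the timestamped split is the safer choice for reproducible analyses, since "latest" moves whenever a new evaluation run is added.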
open-llm-leaderboard/details_Azazelle__Sina-Thor-7b-Merge
[ "region:us" ]
2024-01-11T02:07:18+00:00
{"pretty_name": "Evaluation run of Azazelle/Sina-Thor-7b-Merge", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azazelle/Sina-Thor-7b-Merge](https://huggingface.co/Azazelle/Sina-Thor-7b-Merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__Sina-Thor-7b-Merge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T02:04:55.621660](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Sina-Thor-7b-Merge/blob/main/results_2024-01-11T02-04-55.621660.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.654717106968504,\n \"acc_stderr\": 0.03185313943141863,\n \"acc_norm\": 0.6553216710780829,\n \"acc_norm_stderr\": 0.032502309346968794,\n \"mc1\": 0.3353733170134639,\n \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.5000640698738921,\n \"mc2_stderr\": 0.015237006727015454\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759088,\n \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.01382204792228351\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6745668193586934,\n \"acc_stderr\": 0.0046757891569776475,\n \"acc_norm\": 0.8569010157339175,\n \"acc_norm_stderr\": 0.003494581076398528\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106136,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106136\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.022755204959542943,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.022755204959542943\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6923076923076923,\n \"acc_stderr\": 0.02340092891831049,\n \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.02340092891831049\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188703,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188703\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233504,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233504\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.0134682016140663,\n \"acc_norm\": 
0.8288633461047255,\n \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500107,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500107\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.01274920600765747,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.01274920600765747\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.5000640698738921,\n \"mc2_stderr\": 0.015237006727015454\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938278\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6868840030326004,\n \"acc_stderr\": 0.012774285669385092\n }\n}\n```", "repo_url": 
"https://huggingface.co/Azazelle/Sina-Thor-7b-Merge", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|arc:challenge|25_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|gsm8k|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hellaswag|10_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-04-55.621660.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-04-55.621660.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-04-55.621660.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T02-04-55.621660.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-04-55.621660.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T02_04_55.621660", "path": ["**/details_harness|winogrande|5_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T02-04-55.621660.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T02_04_55.621660", "path": ["results_2024-01-11T02-04-55.621660.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T02-04-55.621660.parquet"]}]}]}
2024-01-11T02:07:42+00:00
[]
[]
TAGS
#region-us
# Dataset Card for Evaluation run of Azazelle/Sina-Thor-7b-Merge

Dataset automatically created during the evaluation run of model Azazelle/Sina-Thor-7b-Merge on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch after this card):

## Latest results

These are the latest results from run 2024-01-11T02:04:55.621660 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):

## Dataset Details

### Dataset Description

- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:

### Dataset Sources [optional]

- Repository:
- Paper [optional]:
- Demo [optional]:

## Uses

### Direct Use

### Out-of-Scope Use

## Dataset Structure

## Dataset Creation

### Curation Rationale

### Source Data

#### Data Collection and Processing

#### Who are the source data producers?

### Annotations [optional]

#### Annotation process

#### Who are the annotators?

#### Personal and Sensitive Information

## Bias, Risks, and Limitations

### Recommendations

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

[optional]

BibTeX:

APA:

## Glossary [optional]

## More Information [optional]

## Dataset Card Authors [optional]

## Dataset Card Contact
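The loading call referenced under "To load the details from a run" is not spelled out in this flattened copy of the card. A minimal sketch, assuming the repository id follows the same `details_<org>__<model>` pattern used by the Open LLM Leaderboard and that any per-task config name from this record's metadata (e.g. `harness_winogrande_5`) is valid:

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard's details_<org>__<model>
# naming pattern (an assumption; it is not stated verbatim in this copy).
data = load_dataset(
    "open-llm-leaderboard/details_Azazelle__Sina-Thor-7b-Merge",
    "harness_winogrande_5",  # any per-task config from the metadata works
    split="train",
)
print(data)
```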
[ "# Dataset Card for Evaluation run of Azazelle/Sina-Thor-7b-Merge\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Sina-Thor-7b-Merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T02:04:55.621660(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Azazelle/Sina-Thor-7b-Merge\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Sina-Thor-7b-Merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T02:04:55.621660(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5116eb8297d6850e4d30a1ad45d92824f0191fe6
# Dataset Card for Evaluation run of Azazelle/Sina-Odin-7b-Merge

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Azazelle/Sina-Odin-7b-Merge](https://huggingface.co/Azazelle/Sina-Odin-7b-Merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__Sina-Odin-7b-Merge",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-01-11T02:12:52.952838](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Sina-Odin-7b-Merge/blob/main/results_2024-01-11T02-12-52.952838.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):

```python
{
    "all": { "acc": 0.4548612777020411, "acc_stderr": 0.03415185471656589, "acc_norm": 0.4605700654166129, "acc_norm_stderr": 0.03496102721579447, "mc1": 0.26438188494492043, "mc1_stderr": 0.015438211119522514, "mc2": 0.39195277658680794, "mc2_stderr": 0.014470127363546723 },
    "harness|arc:challenge|25": { "acc": 0.492320819112628, "acc_stderr": 0.014609667440892577, "acc_norm": 0.5281569965870307, "acc_norm_stderr": 0.014588204105102203 },
    "harness|hellaswag|10": { "acc": 0.492531368253336, "acc_stderr": 0.004989224715784536, "acc_norm": 0.6886078470424218, "acc_norm_stderr": 0.004621163476949224 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.45185185185185184, "acc_stderr": 0.04299268905480864, "acc_norm": 0.45185185185185184, "acc_norm_stderr": 0.04299268905480864 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.5394736842105263, "acc_stderr": 0.04056242252249033, "acc_norm": 0.5394736842105263, "acc_norm_stderr": 0.04056242252249033 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5056603773584906, "acc_stderr": 0.030770900763851302, "acc_norm": 0.5056603773584906, "acc_norm_stderr": 0.030770900763851302 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.4652777777777778, "acc_stderr": 0.04171115858181618, "acc_norm": 0.4652777777777778, "acc_norm_stderr": 0.04171115858181618 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.4624277456647399, "acc_stderr": 0.0380168510452446, "acc_norm": 0.4624277456647399, "acc_norm_stderr": 0.0380168510452446 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.044405219061793275, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.044405219061793275 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.65, "acc_stderr": 0.04793724854411018, "acc_norm": 0.65, "acc_norm_stderr": 0.04793724854411018 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4553191489361702, "acc_stderr": 0.03255525359340356, "acc_norm": 0.4553191489361702, "acc_norm_stderr": 0.03255525359340356 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.35964912280701755, "acc_stderr": 0.04514496132873633, "acc_norm": 0.35964912280701755, "acc_norm_stderr": 0.04514496132873633 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.45517241379310347, "acc_stderr": 0.04149886942192118, "acc_norm": 0.45517241379310347, "acc_norm_stderr": 0.04149886942192118 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3492063492063492, "acc_stderr": 0.024552292209342668, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.024552292209342668 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.25396825396825395, "acc_stderr": 0.03893259610604674, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.03893259610604674 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5193548387096775, "acc_stderr": 0.0284226874043121, "acc_norm": 0.5193548387096775, "acc_norm_stderr": 0.0284226874043121 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3399014778325123, "acc_stderr": 0.033327690684107895, "acc_norm": 0.3399014778325123, "acc_norm_stderr": 0.033327690684107895 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.23030303030303031, "acc_stderr": 0.03287666758603488, "acc_norm": 0.23030303030303031, "acc_norm_stderr": 0.03287666758603488 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6565656565656566, "acc_stderr": 0.03383201223244441, "acc_norm": 0.6565656565656566, "acc_norm_stderr": 0.03383201223244441 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6839378238341969, "acc_stderr": 0.033553973696861736, "acc_norm": 0.6839378238341969, "acc_norm_stderr": 0.033553973696861736 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5358974358974359, "acc_stderr": 0.025285585990017845, "acc_norm": 0.5358974358974359, "acc_norm_stderr": 0.025285585990017845 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.22962962962962963, "acc_stderr": 0.025644108639267613, "acc_norm": 0.22962962962962963, "acc_norm_stderr": 0.025644108639267613 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.47478991596638653, "acc_stderr": 0.032437180551374095, "acc_norm": 0.47478991596638653, "acc_norm_stderr": 0.032437180551374095 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.03734535676787198, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.03734535676787198 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.618348623853211, "acc_stderr": 0.020828148517022582, "acc_norm": 0.618348623853211, "acc_norm_stderr": 0.020828148517022582 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2824074074074074, "acc_stderr": 0.030701372111510923, "acc_norm": 0.2824074074074074, "acc_norm_stderr": 0.030701372111510923 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2647058823529412, "acc_stderr": 0.03096451792692341, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.03096451792692341 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.350210970464135, "acc_stderr": 0.031052391937584353, "acc_norm": 0.350210970464135, "acc_norm_stderr": 0.031052391937584353 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.57847533632287, "acc_stderr": 0.033141902221106564, "acc_norm": 0.57847533632287, "acc_norm_stderr": 0.033141902221106564 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5114503816793893, "acc_stderr": 0.04384140024078016, "acc_norm": 0.5114503816793893, "acc_norm_stderr": 0.04384140024078016 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.6198347107438017, "acc_stderr": 0.04431324501968432, "acc_norm": 0.6198347107438017, "acc_norm_stderr": 0.04431324501968432 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356461, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356461 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5030674846625767, "acc_stderr": 0.03928297078179663, "acc_norm": 0.5030674846625767, "acc_norm_stderr": 0.03928297078179663 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028547, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028547 },
    "harness|hendrycksTest-management|5": { "acc": 0.6310679611650486, "acc_stderr": 0.0477761518115674, "acc_norm": 0.6310679611650486, "acc_norm_stderr": 0.0477761518115674 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.7393162393162394, "acc_stderr": 0.028760348956523414, "acc_norm": 0.7393162393162394, "acc_norm_stderr": 0.028760348956523414 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6615581098339719, "acc_stderr": 0.01692086958621067, "acc_norm": 0.6615581098339719, "acc_norm_stderr": 0.01692086958621067 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.4682080924855491, "acc_stderr": 0.026864624366756656, "acc_norm": 0.4682080924855491, "acc_norm_stderr": 0.026864624366756656 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2782122905027933, "acc_stderr": 0.014987325439963561, "acc_norm": 0.2782122905027933, "acc_norm_stderr": 0.014987325439963561 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.4411764705882353, "acc_stderr": 0.028431095444176647, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.028431095444176647 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.5273311897106109, "acc_stderr": 0.028355633568328174, "acc_norm": 0.5273311897106109, "acc_norm_stderr": 0.028355633568328174 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.5061728395061729, "acc_stderr": 0.027818623962583295, "acc_norm": 0.5061728395061729, "acc_norm_stderr": 0.027818623962583295 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.35815602836879434, "acc_stderr": 0.02860208586275942, "acc_norm": 0.35815602836879434, "acc_norm_stderr": 0.02860208586275942 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.2692307692307692, "acc_stderr": 0.01132873440314033, "acc_norm": 0.2692307692307692, "acc_norm_stderr": 0.01132873440314033 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.40441176470588236, "acc_stderr": 0.02981263070156974, "acc_norm": 0.40441176470588236, "acc_norm_stderr": 0.02981263070156974 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.44281045751633985, "acc_stderr": 0.020095083154577347, "acc_norm": 0.44281045751633985, "acc_norm_stderr": 0.020095083154577347 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.3877551020408163, "acc_stderr": 0.031192230726795656, "acc_norm": 0.3877551020408163, "acc_norm_stderr": 0.031192230726795656 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.6218905472636815, "acc_stderr": 0.034288678487786564, "acc_norm": 0.6218905472636815, "acc_norm_stderr": 0.034288678487786564 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 },
    "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.6666666666666666, "acc_stderr": 0.036155076303109365, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.036155076303109365 },
    "harness|truthfulqa:mc|0": { "mc1": 0.26438188494492043, "mc1_stderr": 0.015438211119522514, "mc2": 0.39195277658680794, "mc2_stderr": 0.014470127363546723 },
    "harness|winogrande|5": { "acc": 0.7221783741120757, "acc_stderr": 0.012588918183871598 },
    "harness|gsm8k|5": { "acc": 0.08263836239575435, "acc_stderr": 0.00758408922014812 }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
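Beyond the per-task details loaded above, the aggregated metrics described in the card live in the "results" configuration, with the "latest" split pointing at the most recent run. A minimal sketch of pulling them; the config and split names are taken from the metadata pattern of these records, and the exact row layout is an assumption:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of each run; "latest" always
# resolves to the most recent results parquet for this model.
results = load_dataset(
    "open-llm-leaderboard/details_Azazelle__Sina-Odin-7b-Merge",
    "results",
    split="latest",
)
print(results[0])  # one row per run, mirroring the JSON block above
```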
open-llm-leaderboard/details_Azazelle__Sina-Odin-7b-Merge
[ "region:us" ]
2024-01-11T02:15:13+00:00
{"pretty_name": "Evaluation run of Azazelle/Sina-Odin-7b-Merge", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azazelle/Sina-Odin-7b-Merge](https://huggingface.co/Azazelle/Sina-Odin-7b-Merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__Sina-Odin-7b-Merge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T02:12:52.952838](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Sina-Odin-7b-Merge/blob/main/results_2024-01-11T02-12-52.952838.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4548612777020411,\n \"acc_stderr\": 0.03415185471656589,\n \"acc_norm\": 0.4605700654166129,\n \"acc_norm_stderr\": 0.03496102721579447,\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.39195277658680794,\n \"mc2_stderr\": 0.014470127363546723\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.492320819112628,\n \"acc_stderr\": 0.014609667440892577,\n \"acc_norm\": 0.5281569965870307,\n \"acc_norm_stderr\": 0.014588204105102203\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.492531368253336,\n \"acc_stderr\": 0.004989224715784536,\n \"acc_norm\": 0.6886078470424218,\n \"acc_norm_stderr\": 0.004621163476949224\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851302,\n \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851302\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n 
\"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340356,\n \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340356\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342668,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342668\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n \"acc_stderr\": 0.0284226874043121,\n \"acc_norm\": 0.5193548387096775,\n \"acc_norm_stderr\": 0.0284226874043121\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017845,\n \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017845\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267613,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267613\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.032437180551374095,\n \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.032437180551374095\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.618348623853211,\n \"acc_stderr\": 0.020828148517022582,\n \"acc_norm\": 0.618348623853211,\n \"acc_norm_stderr\": 0.020828148517022582\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510923,\n \"acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510923\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.350210970464135,\n \"acc_stderr\": 0.031052391937584353,\n \"acc_norm\": 0.350210970464135,\n \"acc_norm_stderr\": 0.031052391937584353\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.57847533632287,\n \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6615581098339719,\n \"acc_stderr\": 0.01692086958621067,\n \"acc_norm\": 0.6615581098339719,\n \"acc_norm_stderr\": 0.01692086958621067\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.026864624366756656,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.026864624366756656\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n \"acc_stderr\": 0.014987325439963561,\n \"acc_norm\": 0.2782122905027933,\n \"acc_norm_stderr\": 0.014987325439963561\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.028431095444176647,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.028431095444176647\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5273311897106109,\n \"acc_stderr\": 0.028355633568328174,\n \"acc_norm\": 0.5273311897106109,\n \"acc_norm_stderr\": 0.028355633568328174\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.01132873440314033,\n \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.01132873440314033\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.40441176470588236,\n \"acc_stderr\": 0.02981263070156974,\n \"acc_norm\": 0.40441176470588236,\n \"acc_norm_stderr\": 0.02981263070156974\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.44281045751633985,\n \"acc_stderr\": 0.020095083154577347,\n \"acc_norm\": 0.44281045751633985,\n \"acc_norm_stderr\": 0.020095083154577347\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.031192230726795656,\n \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.031192230726795656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036155076303109365,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036155076303109365\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.39195277658680794,\n \"mc2_stderr\": 0.014470127363546723\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871598\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08263836239575435,\n \"acc_stderr\": 
0.00758408922014812\n }\n}\n```", "repo_url": "https://huggingface.co/Azazelle/Sina-Odin-7b-Merge", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|arc:challenge|25_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|gsm8k|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hellaswag|10_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-12-52.952838.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-12-52.952838.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-12-52.952838.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T02-12-52.952838.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-12-52.952838.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T02_12_52.952838", "path": ["**/details_harness|winogrande|5_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T02-12-52.952838.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T02_12_52.952838", "path": ["results_2024-01-11T02-12-52.952838.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T02-12-52.952838.parquet"]}]}]}
2024-01-11T02:15:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Azazelle/Sina-Odin-7b-Merge Dataset automatically created during the evaluation run of model Azazelle/Sina-Odin-7b-Merge on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T02:12:52.952838 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Azazelle/Sina-Odin-7b-Merge\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Sina-Odin-7b-Merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T02:12:52.952838(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Azazelle/Sina-Odin-7b-Merge\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Sina-Odin-7b-Merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T02:12:52.952838(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
431dcaedfe9ccf5884a92fd295c19a75a4e3a6c7
# Dataset Card for Evaluation run of Azazelle/Sina-Loki-7b-Merge <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Azazelle/Sina-Loki-7b-Merge](https://huggingface.co/Azazelle/Sina-Loki-7b-Merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Azazelle__Sina-Loki-7b-Merge", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T02:17:43.890084](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Sina-Loki-7b-Merge/blob/main/results_2024-01-11T02-17-43.890084.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6454070533261792, "acc_stderr": 0.032016059335499426, "acc_norm": 0.6491116180719122, "acc_norm_stderr": 0.03265180945770308, "mc1": 0.34149326805385555, "mc1_stderr": 0.016600688619950822, "mc2": 0.5384293511918705, "mc2_stderr": 0.01503282265681464 }, "harness|arc:challenge|25": { "acc": 0.5614334470989761, "acc_stderr": 0.014500682618212864, "acc_norm": 0.591296928327645, "acc_norm_stderr": 0.014365750345426996 }, "harness|hellaswag|10": { "acc": 0.6164110734913364, "acc_stderr": 0.004852658876775384, "acc_norm": 0.8195578570005975, "acc_norm_stderr": 0.0038376937398170146 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998905, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998905 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493868, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493868 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7916666666666666, "acc_stderr": 0.03396116205845335, "acc_norm": 0.7916666666666666, "acc_norm_stderr": 0.03396116205845335 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108101, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3968253968253968, "acc_stderr": 0.02519710107424649, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.02519710107424649 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377562, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377562 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083525, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5517241379310345, "acc_stderr": 0.03499113137676744, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.03499113137676744 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.03008862949021749, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.03008862949021749 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8652849740932642, "acc_stderr": 0.02463978909770944, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.02463978909770944 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066482, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066482 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7058823529411765, "acc_stderr": 0.029597329730978086, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.029597329730978086 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092448, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092448 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538272, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.026756401538078962, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.026756401538078962 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8354430379746836, "acc_stderr": 0.024135736240566932, "acc_norm": 0.8354430379746836, "acc_norm_stderr": 0.024135736240566932 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8425925925925926, "acc_stderr": 0.03520703990517963, "acc_norm": 0.8425925925925926, "acc_norm_stderr": 0.03520703990517963 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.803680981595092, "acc_stderr": 0.031207970394709218, "acc_norm": 0.803680981595092, "acc_norm_stderr": 0.031207970394709218 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573974, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573974 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165612, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165612 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993457, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993457 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6936416184971098, "acc_stderr": 0.024818350129436593, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.024818350129436593 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2994413407821229, "acc_stderr": 0.015318257745976706, "acc_norm": 0.2994413407821229, "acc_norm_stderr": 0.015318257745976706 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.02526169121972948, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.02526169121972948 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188936, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188936 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7222222222222222, "acc_stderr": 0.024922001168886335, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.024922001168886335 }, "harness|hendrycksTest-professional_accounting|5": 
{ "acc": 0.4574468085106383, "acc_stderr": 0.02971928127223684, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.02971928127223684 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45371577574967403, "acc_stderr": 0.012715404841277738, "acc_norm": 0.45371577574967403, "acc_norm_stderr": 0.012715404841277738 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.01897542792050721, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.01897542792050721 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6979591836734694, "acc_stderr": 0.0293936093198798, "acc_norm": 0.6979591836734694, "acc_norm_stderr": 0.0293936093198798 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578334, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578334 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640044, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640044 }, "harness|truthfulqa:mc|0": { "mc1": 0.34149326805385555, "mc1_stderr": 0.016600688619950822, "mc2": 0.5384293511918705, "mc2_stderr": 0.01503282265681464 }, "harness|winogrande|5": { "acc": 0.7813733228097869, "acc_stderr": 0.01161619821577323 }, "harness|gsm8k|5": { "acc": 0.5238817285822593, "acc_stderr": 0.013756765835465751 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
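The card's example loads the "train" split; per its prose, each eval config also carries a timestamped split per run plus a "latest" split, and the aggregated scores sit in the "results" configuration. A short sketch of both access patterns (same `datasets` API as the card's own example; the timestamped split name is taken from this record's config metadata):

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Azazelle__Sina-Loki-7b-Merge"

# Per-example details for one eval, pinned to the most recent run via the
# "latest" split (equivalent here to split="2024_01_11T02_17_43.890084").
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")

# Aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(repo, "results", split="latest")
```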
open-llm-leaderboard/details_Azazelle__Sina-Loki-7b-Merge
[ "region:us" ]
2024-01-11T02:20:02+00:00
{"pretty_name": "Evaluation run of Azazelle/Sina-Loki-7b-Merge", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azazelle/Sina-Loki-7b-Merge](https://huggingface.co/Azazelle/Sina-Loki-7b-Merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__Sina-Loki-7b-Merge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T02:17:43.890084](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Sina-Loki-7b-Merge/blob/main/results_2024-01-11T02-17-43.890084.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6454070533261792,\n \"acc_stderr\": 0.032016059335499426,\n \"acc_norm\": 0.6491116180719122,\n \"acc_norm_stderr\": 0.03265180945770308,\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950822,\n \"mc2\": 0.5384293511918705,\n \"mc2_stderr\": 0.01503282265681464\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212864,\n \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345426996\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6164110734913364,\n \"acc_stderr\": 0.004852658876775384,\n \"acc_norm\": 0.8195578570005975,\n \"acc_norm_stderr\": 0.0038376937398170146\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493868,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493868\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.03396116205845335,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.03396116205845335\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 
0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.03499113137676744,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.03499113137676744\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978086,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978086\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n 
\"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n \"acc_stderr\": 0.015318257745976706,\n \"acc_norm\": 0.2994413407821229,\n \"acc_norm_stderr\": 0.015318257745976706\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223684,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223684\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.012715404841277738,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.012715404841277738\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950822,\n \"mc2\": 0.5384293511918705,\n \"mc2_stderr\": 0.01503282265681464\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.01161619821577323\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5238817285822593,\n \"acc_stderr\": 0.013756765835465751\n }\n}\n```", "repo_url": 
"https://huggingface.co/Azazelle/Sina-Loki-7b-Merge", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|arc:challenge|25_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|gsm8k|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hellaswag|10_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-17-43.890084.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-17-43.890084.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-17-43.890084.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T02-17-43.890084.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-17-43.890084.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T02_17_43.890084", "path": ["**/details_harness|winogrande|5_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T02-17-43.890084.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T02_17_43.890084", "path": ["results_2024-01-11T02-17-43.890084.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T02-17-43.890084.parquet"]}]}]}
2024-01-11T02:20:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Azazelle/Sina-Loki-7b-Merge Dataset automatically created during the evaluation run of model Azazelle/Sina-Loki-7b-Merge on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T02:17:43.890084 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
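The loading snippet referenced above ("you can for instance do the following:") was stripped from this plain-text rendering of the card. The sketch below reconstructs it as a hedged example: the repository id `open-llm-leaderboard/details_Azazelle__Sina-Loki-7b-Merge` is an assumption inferred from the config names in this record's metadata and the naming convention of the other evaluation-run cards in this dump.

```python
from datasets import load_dataset

# Repo id is assumed from the leaderboard naming convention, not stated
# verbatim in this stripped card text. "harness_winogrande_5" is one of
# the 63 configs listed in the metadata; the "train" split always points
# to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_Azazelle__Sina-Loki-7b-Merge",
    "harness_winogrande_5",
    split="train",
)
```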
[ "# Dataset Card for Evaluation run of Azazelle/Sina-Loki-7b-Merge\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Sina-Loki-7b-Merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T02:17:43.890084(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Azazelle/Sina-Loki-7b-Merge\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Sina-Loki-7b-Merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T02:17:43.890084(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
2b4b602aa631155a4b0bd2d21f2d712515cd1634
# AboutMe: Self-Descriptions in Webpages ## Dataset description **Curated by:** Li Lucy, Suchin Gururangan, Luca Soldaini, Emma Strubell, David Bamman, Lauren Klein, Jesse Dodge **Languages:** English **License:** AI2 ImpACT License - Low Risk Artifacts **Paper:** [https://arxiv.org/abs/2401.06408](https://arxiv.org/abs/2401.06408) ## Dataset sources Common Crawl ## Uses This dataset was originally created to document the effects of different pretraining data curation practices. It is intended for research use, e.g. AI evaluation and analysis of development pipelines or social scientific research of Internet communities and self-presentation. ## Dataset structure This dataset consists of three parts: - `about_pages`: webpages that are self-descriptions and profiles of website creators, or text *about* individuals and organizations on the web. These are zipped files with one json per line, with the following keys: - `url` - `hostname` - `cc_segment` (for tracking where in Common Crawl the page is originally retrieved from) - `text` - `title` (webpage title) - `sampled_pages`: random webpages from the same set of websites, or text created or curated *by* individuals and organizations on the web. It has the same keys as `about_pages`. - `about_pages_meta`: algorithmically extracted information from "About" pages, including: - `hn`: hostname of website - `country`: the most frequent country of locations on the page, obtained using Mordecai3 geoparsing - `roles`: social roles and occupations detected using RoBERTa based on expressions of self-identification, e.g. *I am a **dancer***. Each role is accompanied by sentence number and start/end character offsets. - `class`: whether the page is detected to be an individual or organization - `cluster`: one of fifty topical labels obtained via tf-idf clustering of "about" pages Each file contains one json entry per line. Note that the entries in each file are not in a random order, but instead reflect an ordering outputted by CCNet (e.g. neighboring pages may be similar in Wikipedia-based perplexity.) ## Dataset creation AboutMe is derived from twenty-four snapshots of Common Crawl collected between 2020–05 and 2023–06. We extract text from raw Common Crawl using CCNet, and deduplicate URLs across all snapshots. We only include text that has a fastText English score > 0.5. "About" pages are identified using keywords in URLs (about, about-me, about-us, and bio), and their URLs end in `/keyword/` or `keyword.*`, e.g. `about.html`. We only include pages that have one candidate URL, to avoid ambiguity around which page is actually about the main website creator. If a webpage has both `https` and `http` versions in Common Crawl, we take the `https` version. The "sampled" pages are a single webpage randomly sampled from the website that has an "about" page. More details on metadata creation can be found in our paper, linked above. ## Bias, Risks, and Limitations Algorithmic measurements of textual content are scalable, but imperfect. We acknowledge that our dataset and analysis methods (e.g. classification, information retrieval) can also uphold language norms and standards that may disproportionately affect some social groups over others. We hope that future work continues to improve these content analysis pipelines, especially for long-tail or minoritized language phenomena. 
We encourage researchers using our dataset to minimize the extent to which they infer unlabeled or implicit information about subjects in this dataset, and to assess the risks of inferring various types of information from these pages. In addition, measurements of social identities from AboutMe pages are affected by reporting bias. Future uses of this data should avoid incorporating personally identifiable information into generative models, report only aggregated results, and paraphrase quoted examples in papers to protect the privacy of subjects. ## Citation ``` @misc{lucy2024aboutme, title={AboutMe: Using Self-Descriptions in Webpages to Document the Effects of English Pretraining Data Filters}, author={Li Lucy and Suchin Gururangan and Luca Soldaini and Emma Strubell and David Bamman and Lauren Klein and Jesse Dodge}, year={2024}, eprint={2401.06408}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ## Dataset contact [email protected]
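The jsonl layout described in the Dataset structure section above can be consumed with a few lines of standard-library Python. The sketch below is a minimal, hypothetical reader: the file names (`about_pages.jsonl.gz`, `about_pages_meta.jsonl`) and the exact shape of the `roles` entries are assumptions for illustration, since the card pins down keys but not paths or full schemas.

```python
import gzip
import json

def read_jsonl(path, opener=open):
    """Yield one JSON object per line, matching the one-json-per-line format described above."""
    with opener(path, "rt", encoding="utf-8") as f:
        for line in f:
            yield json.loads(line)

# File names below are placeholders, not paths shipped with the dataset.
# Join the meta records (keyed by 'hn', the website hostname) to the pages.
meta_by_host = {m["hn"]: m for m in read_jsonl("about_pages_meta.jsonl")}

for page in read_jsonl("about_pages.jsonl.gz", opener=gzip.open):
    meta = meta_by_host.get(page["hostname"], {})
    # Keys guaranteed by the card: url, hostname, cc_segment, text, title.
    print(page["url"], page["title"], meta.get("class"), meta.get("cluster"))
```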
allenai/aboutme
[ "size_categories:10M<n<100M", "language:en", "license:other", "common crawl", "webtext", "social nlp", "arxiv:2401.06408", "region:us" ]
2024-01-11T02:22:22+00:00
{"language": ["en"], "license": "other", "size_categories": ["10M<n<100M"], "pretty_name": "AboutMe", "tags": ["common crawl", "webtext", "social nlp"], "extra_gated_prompt": "Access to this dataset is automatically granted upon accepting the [**AI2 ImpACT License - Low Risk Artifacts (\u201cLR Agreement\u201d)**](https://allenai.org/licenses/impact-lr) and completing all fields below.", "extra_gated_fields": {"Your full name": "text", "Organization or entity you are affiliated with": "text", "State or country you are located in": "text", "Contact email": "text", "Please describe your intended use of the medium risk artifact(s)": "text", "I AGREE to the terms and conditions of the MR Agreement above": "checkbox", "I AGREE to AI2\u2019s use of my information for legal notices and administrative matters": "checkbox", "I CERTIFY that the information I have provided is true and accurate": "checkbox"}}
2024-01-16T19:26:57+00:00
[ "2401.06408" ]
[ "en" ]
TAGS #size_categories-10M<n<100M #language-English #license-other #common crawl #webtext #social nlp #arxiv-2401.06408 #region-us
# AboutMe: Self-Descriptions in Webpages ## Dataset description Curated by: Li Lucy, Suchin Gururangan, Luca Soldaini, Emma Strubell, David Bamman, Lauren Klein, Jesse Dodge Languages: English License: AI2 ImpACT License - Low Risk Artifacts Paper: URL ## Dataset sources Common Crawl ## Uses This dataset was originally created to document the effects of different pretraining data curation practices. It is intended for research use, e.g. AI evaluation and analysis of development pipelines or social scientific research of Internet communities and self-presentation. ## Dataset structure This dataset consists of three parts: - 'about_pages': webpages that are self-descriptions and profiles of website creators, or text *about* individuals and organizations on the web. These are zipped files with one json per line, with the following keys: - 'url' - 'hostname' - 'cc_segment' (for tracking where in Common Crawl the page is originally retrieved from) - 'text' - 'title' (webpage title) - 'sampled_pages': random webpages from the same set of websites, or text created or curated *by* individuals and organizations on the web. It has the same keys as 'about_pages'. - 'about_pages_meta': algorithmically extracted information from "About" pages, including: - 'hn': hostname of website - 'country': the most frequent country of locations on the page, obtained using Mordecai3 geoparsing - 'roles': social roles and occupations detected using RoBERTa based on expressions of self-identification, e.g. *I am a dancer*. Each role is accompanied by sentence number and start/end character offsets. - 'class': whether the page is detected to be an individual or organization - 'cluster': one of fifty topical labels obtained via tf-idf clustering of "about" pages Each file contains one json entry per line. Note that the entries in each file are not in a random order, but instead reflect an ordering outputted by CCNet (e.g. neighboring pages may be similar in Wikipedia-based perplexity.) ## Dataset creation AboutMe is derived from twenty-four snapshots of Common Crawl collected between 2020–05 and 2023–06. We extract text from raw Common Crawl using CCNet, and deduplicate URLs across all snapshots. We only include text that has a fastText English score > 0.5. "About" pages are identified using keywords in URLs (about, about-me, about-us, and bio), and their URLs end in '/keyword/' or 'keyword.*', e.g. 'URL'. We only include pages that have one candidate URL, to avoid ambiguity around which page is actually about the main website creator. If a webpage has both 'https' and 'http' versions in Common Crawl, we take the 'https' version. The "sampled" pages are a single webpage randomly sampled from the website that has an "about" page. More details on metadata creation can be found in our paper, linked above. ## Bias, Risks, and Limitations Algorithmic measurements of textual content are scalable, but imperfect. We acknowledge that our dataset and analysis methods (e.g. classification, information retrieval) can also uphold language norms and standards that may disproportionately affect some social groups over others. We hope that future work continues to improve these content analysis pipelines, especially for long-tail or minoritized language phenomena. 
In addition, measurements of social identities from AboutMe pages are affected by reporting bias. Future uses of this data should avoid incorporating personally identifiable information into generative models, report only aggregated results, and paraphrase quoted examples in papers to protect the privacy of subjects. ## Dataset contact lucy3_li@URL
[ "# AboutMe: Self-Descriptions in Webpages", "## Dataset description\n\nCurated by: Li Lucy, Suchin Gururangan, Luca Soldaini, Emma Strubell, David Bamman, Lauren Klein, Jesse Dodge\n\nLanguages: English\n\nLicense: AI2 ImpACT License - Low Risk Artifacts\n\nPaper: URL", "## Dataset sources\n\nCommon Crawl", "## Uses\n\nThis dataset was originally created to document the effects of different pretraining data curation practices. It is intended for research use, e.g. AI evaluation and analysis of development pipelines or social scientific research of Internet communities and self-presentation.", "## Dataset structure\n\nThis dataset consists of three parts: \n- 'about_pages': webpages that are self-descriptions and profiles of website creators, or text *about* individuals and organizations on the web. These are zipped files with one json per line, with the following keys:\n - 'url'\n - 'hostname'\n - 'cc_segment' (for tracking where in Common Crawl the page is originally retrieved from)\n - 'text'\n - 'title' (webpage title)\n- 'sampled_pages': random webpages from the same set of websites, or text created or curated *by* individuals and organizations on the web. It has the same keys as 'about_pages'.\n- 'about_pages_meta': algorithmically extracted information from \"About\" pages, including:\n - 'hn': hostname of website\n - 'country': the most frequent country of locations on the page, obtained using Mordecai3 geoparsing\n - 'roles': social roles and occupations detected using RoBERTa based on expressions of self-identification, e.g. *I am a dancer*. Each role is accompanied by sentence number and start/end character offsets. \n - 'class': whether the page is detected to be an individual or organization\n - 'cluster': one of fifty topical labels obtained via tf-idf clustering of \"about\" pages\n \nEach file contains one json entry per line. Note that the entries in each file are not in a random order, but instead reflect an ordering outputted by CCNet (e.g. neighboring pages may be similar in Wikipedia-based perplexity.)", "## Dataset creation\n\nAboutMe is derived from twenty four snapshots of Common Crawl collected between 2020–05 and 2023–06. We extract text from raw Common Crawl using CCNet, and deduplicate URLs across all snapshots. We only include text that has a fastText English score > 0.5. \"About\" pages are identified using keywords in URLs (about, about-me, about-us, and bio), and their URLs end in '/keyword/' or 'keyword.*', e.g. 'URL'. We only include pages that have one candidate URL, to avoid ambiguity around which page is actually about the main website creator. If a webpage has both 'https' and 'http' versions in Common Crawl, we take the 'https' version. The \"sampled\" pages are a single webpage randomly sampled from the website that has an \"about\" page.\n\nMore details on metadata creation can be found in our paper, linked above.", "## Bias, Risks, and Limitations\n\nAlgorithmic measurements of textual content is scalable, but imperfect. We acknowledge that our dataset and analysis methods (e.g. classification, information retrieval) can also uphold language norms and standards that may disproportionately affect some social groups over others. 
We hope that future work continues to improve these content analysis pipelines, especially for long-tail or minoritized language phenomena.\n\nWe encourage researchers using our dataset to minimize the extent to which they infer unlabeled or implicit information about subjects in this dataset, and to assess the risks of inferring various types of information from these pages. In addition, measurements of social identities from AboutMe pages are affected by reporting bias. \n\nFuture uses of this data should avoid incorporating personally identifiable information into generative models, report only aggregated results, and paraphrase quoted examples in papers to protect the privacy of subjects.", "## Dataset contact\n\nlucy3_li@URL" ]
[ "TAGS\n#size_categories-10M<n<100M #language-English #license-other #common crawl #webtext #social nlp #arxiv-2401.06408 #region-us \n", "# AboutMe: Self-Descriptions in Webpages", "## Dataset description\n\nCurated by: Li Lucy, Suchin Gururangan, Luca Soldaini, Emma Strubell, David Bamman, Lauren Klein, Jesse Dodge\n\nLanguages: English\n\nLicense: AI2 ImpACT License - Low Risk Artifacts\n\nPaper: URL", "## Dataset sources\n\nCommon Crawl", "## Uses\n\nThis dataset was originally created to document the effects of different pretraining data curation practices. It is intended for research use, e.g. AI evaluation and analysis of development pipelines or social scientific research of Internet communities and self-presentation.", "## Dataset structure\n\nThis dataset consists of three parts: \n- 'about_pages': webpages that are self-descriptions and profiles of website creators, or text *about* individuals and organizations on the web. These are zipped files with one json per line, with the following keys:\n - 'url'\n - 'hostname'\n - 'cc_segment' (for tracking where in Common Crawl the page is originally retrieved from)\n - 'text'\n - 'title' (webpage title)\n- 'sampled_pages': random webpages from the same set of websites, or text created or curated *by* individuals and organizations on the web. It has the same keys as 'about_pages'.\n- 'about_pages_meta': algorithmically extracted information from \"About\" pages, including:\n - 'hn': hostname of website\n - 'country': the most frequent country of locations on the page, obtained using Mordecai3 geoparsing\n - 'roles': social roles and occupations detected using RoBERTa based on expressions of self-identification, e.g. *I am a dancer*. Each role is accompanied by sentence number and start/end character offsets. \n - 'class': whether the page is detected to be an individual or organization\n - 'cluster': one of fifty topical labels obtained via tf-idf clustering of \"about\" pages\n \nEach file contains one json entry per line. Note that the entries in each file are not in a random order, but instead reflect an ordering outputted by CCNet (e.g. neighboring pages may be similar in Wikipedia-based perplexity.)", "## Dataset creation\n\nAboutMe is derived from twenty four snapshots of Common Crawl collected between 2020–05 and 2023–06. We extract text from raw Common Crawl using CCNet, and deduplicate URLs across all snapshots. We only include text that has a fastText English score > 0.5. \"About\" pages are identified using keywords in URLs (about, about-me, about-us, and bio), and their URLs end in '/keyword/' or 'keyword.*', e.g. 'URL'. We only include pages that have one candidate URL, to avoid ambiguity around which page is actually about the main website creator. If a webpage has both 'https' and 'http' versions in Common Crawl, we take the 'https' version. The \"sampled\" pages are a single webpage randomly sampled from the website that has an \"about\" page.\n\nMore details on metadata creation can be found in our paper, linked above.", "## Bias, Risks, and Limitations\n\nAlgorithmic measurements of textual content is scalable, but imperfect. We acknowledge that our dataset and analysis methods (e.g. classification, information retrieval) can also uphold language norms and standards that may disproportionately affect some social groups over others. 
We hope that future work continues to improve these content analysis pipelines, especially for long-tail or minoritized language phenomena.\n\nWe encourage researchers using our dataset to minimize the extent to which they infer unlabeled or implicit information about subjects in this dataset, and to assess the risks of inferring various types of information from these pages. In addition, measurements of social identities from AboutMe pages are affected by reporting bias. \n\nFuture uses of this data should avoid incorporating personally identifiable information into generative models, report only aggregated results, and paraphrase quoted examples in papers to protect the privacy of subjects.", "## Dataset contact\n\nlucy3_li@URL" ]
5771ada211db135646aaddd10005b6e4e0118bb7
# Dataset Card for Evaluation run of Xenon1/MetaModel_moex8 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Xenon1/MetaModel_moex8](https://huggingface.co/Xenon1/MetaModel_moex8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Xenon1__MetaModel_moex8", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T02:53:29.690777](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__MetaModel_moex8/blob/main/results_2024-01-11T02-53-29.690777.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6666685509187187, "acc_stderr": 0.031624554326251955, "acc_norm": 0.6674538314797805, "acc_norm_stderr": 0.03226818453028863, "mc1": 0.5703794369645043, "mc1_stderr": 0.017329234580409095, "mc2": 0.7190575993623916, "mc2_stderr": 0.015005319709933473 }, "harness|arc:challenge|25": { "acc": 0.6825938566552902, "acc_stderr": 0.013602239088038167, "acc_norm": 0.71160409556314, "acc_norm_stderr": 0.013238394422428173 }, "harness|hellaswag|10": { "acc": 0.7136028679545907, "acc_stderr": 0.004511533039406212, "acc_norm": 0.883788090021908, "acc_norm_stderr": 0.00319823895181762 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.756578947368421, "acc_stderr": 0.034923496688842384, "acc_norm": 0.756578947368421, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.02863723563980089, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.02863723563980089 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956913, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956913 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.625531914893617, "acc_stderr": 0.03163910665367291, "acc_norm": 0.625531914893617, "acc_norm_stderr": 0.03163910665367291 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6344827586206897, "acc_stderr": 0.040131241954243856, "acc_norm": 0.6344827586206897, "acc_norm_stderr": 0.040131241954243856 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4947089947089947, "acc_stderr": 0.02574986828855657, "acc_norm": 0.4947089947089947, "acc_norm_stderr": 0.02574986828855657 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.0442626668137991, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.0442626668137991 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8193548387096774, "acc_stderr": 0.021886178567172534, "acc_norm": 0.8193548387096774, "acc_norm_stderr": 0.021886178567172534 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.03517945038691063, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8121212121212121, "acc_stderr": 0.03050193405942914, "acc_norm": 0.8121212121212121, "acc_norm_stderr": 0.03050193405942914 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8686868686868687, "acc_stderr": 0.024063156416822516, "acc_norm": 0.8686868686868687, "acc_norm_stderr": 0.024063156416822516 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603348, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603348 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563976, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563976 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.36666666666666664, "acc_stderr": 0.029381620726465073, "acc_norm": 0.36666666666666664, "acc_norm_stderr": 0.029381620726465073 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7142857142857143, "acc_stderr": 0.029344572500634332, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.029344572500634332 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 
0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374308, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374308 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5740740740740741, "acc_stderr": 0.03372343271653062, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.03372343271653062 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8578431372549019, "acc_stderr": 0.02450980392156862, "acc_norm": 0.8578431372549019, "acc_norm_stderr": 0.02450980392156862 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8481012658227848, "acc_stderr": 0.023363878096632446, "acc_norm": 0.8481012658227848, "acc_norm_stderr": 0.023363878096632446 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.03768335959728743, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.03768335959728743 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.033519538795212696, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.033519538795212696 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.03492606476623791, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.03492606476623791 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.02280138253459753, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.02280138253459753 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8058748403575989, "acc_stderr": 0.014143970276657569, "acc_norm": 0.8058748403575989, "acc_norm_stderr": 0.014143970276657569 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7543352601156069, "acc_stderr": 0.023176298203992005, "acc_norm": 0.7543352601156069, "acc_norm_stderr": 0.023176298203992005 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.394413407821229, "acc_stderr": 0.01634538676210397, "acc_norm": 0.394413407821229, "acc_norm_stderr": 0.01634538676210397 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.761437908496732, "acc_stderr": 0.024404394928087866, "acc_norm": 0.761437908496732, "acc_norm_stderr": 0.024404394928087866 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7266881028938906, "acc_stderr": 0.025311765975426122, "acc_norm": 0.7266881028938906, "acc_norm_stderr": 0.025311765975426122 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7839506172839507, "acc_stderr": 0.022899162918445806, "acc_norm": 0.7839506172839507, "acc_norm_stderr": 0.022899162918445806 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4921773142112125, "acc_stderr": 0.0127686730761119, "acc_norm": 0.4921773142112125, "acc_norm_stderr": 0.0127686730761119 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7389705882352942, "acc_stderr": 0.026679252270103128, "acc_norm": 0.7389705882352942, "acc_norm_stderr": 0.026679252270103128 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.018901015322093092, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.018901015322093092 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142783, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.02650859065623327, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.02650859065623327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598053, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598053 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.03158149539338733, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.03158149539338733 }, "harness|truthfulqa:mc|0": { "mc1": 0.5703794369645043, "mc1_stderr": 0.017329234580409095, "mc2": 0.7190575993623916, "mc2_stderr": 0.015005319709933473 }, "harness|winogrande|5": { "acc": 0.8326756116811366, "acc_stderr": 0.010490608806828075 }, "harness|gsm8k|5": { "acc": 0.6535253980288097, "acc_stderr": 0.013107179054313403 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Xenon1__MetaModel_moex8
[ "region:us" ]
2024-01-11T02:55:48+00:00
{"pretty_name": "Evaluation run of Xenon1/MetaModel_moex8", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xenon1/MetaModel_moex8](https://huggingface.co/Xenon1/MetaModel_moex8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__MetaModel_moex8\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T02:53:29.690777](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__MetaModel_moex8/blob/main/results_2024-01-11T02-53-29.690777.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6666685509187187,\n \"acc_stderr\": 0.031624554326251955,\n \"acc_norm\": 0.6674538314797805,\n \"acc_norm_stderr\": 0.03226818453028863,\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7190575993623916,\n \"mc2_stderr\": 0.015005319709933473\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428173\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7136028679545907,\n \"acc_stderr\": 0.004511533039406212,\n \"acc_norm\": 0.883788090021908,\n \"acc_norm_stderr\": 0.00319823895181762\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n 
\"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n \"acc_stderr\": 0.014143970276657569,\n 
\"acc_norm\": 0.8058748403575989,\n \"acc_norm_stderr\": 0.014143970276657569\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087866,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087866\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n \"acc_stderr\": 0.0127686730761119,\n \"acc_norm\": 0.4921773142112125,\n \"acc_norm_stderr\": 0.0127686730761119\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7190575993623916,\n \"mc2_stderr\": 0.015005319709933473\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6535253980288097,\n \"acc_stderr\": 0.013107179054313403\n }\n}\n```", "repo_url": 
"https://huggingface.co/Xenon1/MetaModel_moex8", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|arc:challenge|25_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|gsm8k|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hellaswag|10_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-53-29.690777.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-53-29.690777.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-53-29.690777.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T02-53-29.690777.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-53-29.690777.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T02_53_29.690777", "path": ["**/details_harness|winogrande|5_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T02-53-29.690777.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T02_53_29.690777", "path": ["results_2024-01-11T02-53-29.690777.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T02-53-29.690777.parquet"]}]}]}
2024-01-11T02:56:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Xenon1/MetaModel_moex8 Dataset automatically created during the evaluation run of model Xenon1/MetaModel_moex8 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T02:53:29.690777 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
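The loader referenced by "you can for instance do the following" was stripped from this flattened text field; the record's metadata above preserves it verbatim, using the standard `datasets` API:

```python
from datasets import load_dataset

# Load the details of one evaluated task (here: 5-shot Winogrande) from the latest run.
data = load_dataset("open-llm-leaderboard/details_Xenon1__MetaModel_moex8",
	"harness_winogrande_5",
	split="train")
```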
[ "# Dataset Card for Evaluation run of Xenon1/MetaModel_moex8\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/MetaModel_moex8 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T02:53:29.690777(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Xenon1/MetaModel_moex8\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/MetaModel_moex8 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T02:53:29.690777(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a9c8bd1ad449a8e29f72086f3aca247744d22fb8
# Llama 2 Vietnamese dataset The Alpaca dataset translated into Vietnamese following the [Llama 2 Prompt](https://gpus.llm-utils.org/llama-2-prompt-template/) template. ## Prompt template ```plain <s>[INST] <<SYS>> {system_message} <</SYS>> {user_message_1} [/INST] {model_reply_1}</s><s>[INST] {user_message_2} [/INST] ``` ## Authors - [Iambestfeed](https://github.com/Iambestfeed) - [Alex Nguyen](https://github.com/tiendung) - [Thanh Trần](https://github.com/ging-dev)
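As a brief usage sketch (hypothetical, not part of the original card; the dataset id and its single `text` feature come from this record's metadata), each row already carries a fully rendered prompt in the template above:

```python
from datasets import load_dataset

# Illustrative only: fetch one pre-formatted Llama 2 prompt string from the train split.
ds = load_dataset("gingdev/llama_vi_52k", split="train")
print(ds[0]["text"])  # expected shape: "<s>[INST] <<SYS>> ... <</SYS>> ... [/INST] ...</s>"
```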
gingdev/llama_vi_52k
[ "task_categories:question-answering", "task_categories:text-generation", "size_categories:10K<n<100K", "language:vi", "license:mit", "region:us" ]
2024-01-11T03:29:31+00:00
{"language": ["vi"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "text-generation"], "pretty_name": "ging llama 2 dataset", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 26089653, "num_examples": 51092}], "download_size": 12905007, "dataset_size": 26089653}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-13T04:08:11+00:00
[]
[ "vi" ]
TAGS #task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100K #language-Vietnamese #license-mit #region-us
# Llama 2 Vietnamese dataset The Alpaca dataset translated into Vietnamese following the Llama 2 Prompt template. ## Prompt template ## Authors - Iambestfeed - Alex Nguyen - Thanh Trần
[ "# Llama 2 Vietnamese dataset\n\nBộ dữ liệu Alpaca được dịch sang tiếng Việt theo chuẩn Llama 2 Prompt.", "## Prompt template", "## Tác giả\n- Iambestfeed\n- Alex Nguyen\n- Thanh Trần" ]
[ "TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100K #language-Vietnamese #license-mit #region-us \n", "# Llama 2 Vietnamese dataset\n\nBộ dữ liệu Alpaca được dịch sang tiếng Việt theo chuẩn Llama 2 Prompt.", "## Prompt template", "## Tác giả\n- Iambestfeed\n- Alex Nguyen\n- Thanh Trần" ]
af7d640666eb77f2118e09c5baaa133107deee0c
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-7B](https://huggingface.co/decruz07/kellemar-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_decruz07__kellemar-DPO-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T06:58:28.496498](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-7B/blob/main/results_2024-01-11T06-58-28.496498.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6364735443577895, "acc_stderr": 0.032271218354351386, "acc_norm": 0.6383021875744406, "acc_norm_stderr": 0.03291695999578323, "mc1": 0.3818849449204406, "mc1_stderr": 0.017008101939163495, "mc2": 0.5554845037946894, "mc2_stderr": 0.015378736360042876 }, "harness|arc:challenge|25": { "acc": 0.621160409556314, "acc_stderr": 0.014175915490000326, "acc_norm": 0.6604095563139932, "acc_norm_stderr": 0.013839039762820169 }, "harness|hellaswag|10": { "acc": 0.6629157538338977, "acc_stderr": 0.004717478335689633, "acc_norm": 0.8521210914160526, "acc_norm_stderr": 0.0035425443194051416 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119668, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119668 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.03714325906302065, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.03714325906302065 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.048580835742663454, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.048580835742663454 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5446808510638298, "acc_stderr": 0.032555253593403555, "acc_norm": 0.5446808510638298, "acc_norm_stderr": 0.032555253593403555 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.0416180850350153, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.0416180850350153 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.025355741263055266, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.025355741263055266 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.02289168798455495, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.02289168798455495 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.023381935348121437, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.023381935348121437 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6153846153846154, "acc_stderr": 0.024666744915187208, "acc_norm": 0.6153846153846154, "acc_norm_stderr": 0.024666744915187208 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.027940457136228402, "acc_norm": 0.3, "acc_norm_stderr": 0.027940457136228402 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886797, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886797 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 
    "harness|hendrycksTest-high_school_psychology|5": {"acc": 0.8256880733944955, "acc_stderr": 0.016265675632010344, "acc_norm": 0.8256880733944955, "acc_norm_stderr": 0.016265675632010344},
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749},
    "harness|hendrycksTest-high_school_us_history|5": {"acc": 0.8186274509803921, "acc_stderr": 0.027044621719474082, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.027044621719474082},
    "harness|hendrycksTest-high_school_world_history|5": {"acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856},
    "harness|hendrycksTest-human_aging|5": {"acc": 0.6995515695067265, "acc_stderr": 0.03076935200822914, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.03076935200822914},
    "harness|hendrycksTest-human_sexuality|5": {"acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469},
    "harness|hendrycksTest-international_law|5": {"acc": 0.7520661157024794, "acc_stderr": 0.03941897526516304, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516304},
    "harness|hendrycksTest-jurisprudence|5": {"acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626},
    "harness|hendrycksTest-logical_fallacies|5": {"acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178},
    "harness|hendrycksTest-machine_learning|5": {"acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123},
    "harness|hendrycksTest-management|5": {"acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8632478632478633, "acc_stderr": 0.022509033937077805, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.022509033937077805},
    "harness|hendrycksTest-medical_genetics|5": {"acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845},
    "harness|hendrycksTest-miscellaneous|5": {"acc": 0.8212005108556832, "acc_stderr": 0.013702643715368983, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368983},
    "harness|hendrycksTest-moral_disputes|5": {"acc": 0.7167630057803468, "acc_stderr": 0.02425790170532338, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.02425790170532338},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.33854748603351953, "acc_stderr": 0.01582670009648135, "acc_norm": 0.33854748603351953, "acc_norm_stderr": 0.01582670009648135},
    "harness|hendrycksTest-nutrition|5": {"acc": 0.761437908496732, "acc_stderr": 0.02440439492808787, "acc_norm": 0.761437908496732, "acc_norm_stderr": 0.02440439492808787},
    "harness|hendrycksTest-philosophy|5": {"acc": 0.6881028938906752, "acc_stderr": 0.02631185807185416, "acc_norm": 0.6881028938906752, "acc_norm_stderr": 0.02631185807185416},
    "harness|hendrycksTest-prehistory|5": {"acc": 0.7530864197530864, "acc_stderr": 0.02399350170904211, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.02399350170904211},
    "harness|hendrycksTest-professional_accounting|5": {"acc": 0.5106382978723404, "acc_stderr": 0.02982074719142244, "acc_norm": 0.5106382978723404, "acc_norm_stderr": 0.02982074719142244},
    "harness|hendrycksTest-professional_law|5": {"acc": 0.47327249022164275, "acc_stderr": 0.012751977967676008, "acc_norm": 0.47327249022164275, "acc_norm_stderr": 0.012751977967676008},
    "harness|hendrycksTest-professional_medicine|5": {"acc": 0.6727941176470589, "acc_stderr": 0.02850145286039655, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.02850145286039655},
    "harness|hendrycksTest-professional_psychology|5": {"acc": 0.6650326797385621, "acc_stderr": 0.01909422816700032, "acc_norm": 0.6650326797385621, "acc_norm_stderr": 0.01909422816700032},
    "harness|hendrycksTest-public_relations|5": {"acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054},
    "harness|hendrycksTest-security_studies|5": {"acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294},
    "harness|hendrycksTest-sociology|5": {"acc": 0.8109452736318408, "acc_stderr": 0.027686913588013007, "acc_norm": 0.8109452736318408, "acc_norm_stderr": 0.027686913588013007},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637},
    "harness|hendrycksTest-virology|5": {"acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866},
    "harness|truthfulqa:mc|0": {"mc1": 0.3818849449204406, "mc1_stderr": 0.017008101939163495, "mc2": 0.5554845037946894, "mc2_stderr": 0.015378736360042876},
    "harness|winogrande|5": {"acc": 0.7892659826361483, "acc_stderr": 0.01146204641971067},
    "harness|gsm8k|5": {"acc": 0.604245640636846, "acc_stderr": 0.013469823701048806}
}
```
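For quick post-processing of the aggregated numbers above, the raw results file linked at the top of this section can be fetched directly from the repository. The sketch below assumes the file keeps the per-task scores under a top-level `"results"` key (the usual lm-evaluation-harness layout); adjust the key if the layout differs.

```python
import json
from huggingface_hub import hf_hub_download

# Fetch the results file shown above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_decruz07__kellemar-DPO-7B",
    filename="results_2024-01-11T06-58-28.496498.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f).get("results", {})  # assumed top-level key

# Rank the MMLU subtasks by normalized accuracy and print the top five.
mmlu = {k: v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc_norm:.3f}")
```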
## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
open-llm-leaderboard/details_decruz07__kellemar-DPO-7B
[ "region:us" ]
2024-01-11T04:03:56+00:00
{"pretty_name": "Evaluation run of decruz07/kellemar-DPO-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-7B](https://huggingface.co/decruz07/kellemar-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decruz07__kellemar-DPO-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T06:58:28.496498](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-7B/blob/main/results_2024-01-11T06-58-28.496498.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6364735443577895,\n \"acc_stderr\": 0.032271218354351386,\n \"acc_norm\": 0.6383021875744406,\n \"acc_norm_stderr\": 0.03291695999578323,\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5554845037946894,\n \"mc2_stderr\": 0.015378736360042876\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000326,\n \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.013839039762820169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6629157538338977,\n \"acc_stderr\": 0.004717478335689633,\n \"acc_norm\": 0.8521210914160526,\n \"acc_norm_stderr\": 0.0035425443194051416\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455495,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010344,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010344\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n 
\"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700032,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700032\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013007,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013007\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5554845037946894,\n \"mc2_stderr\": 0.015378736360042876\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.01146204641971067\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.604245640636846,\n \"acc_stderr\": 0.013469823701048806\n }\n}\n```", "repo_url": 
"https://huggingface.co/decruz07/kellemar-DPO-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|arc:challenge|25_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|arc:challenge|25_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|gsm8k|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|gsm8k|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hellaswag|10_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hellaswag|10_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T04-01-32.191917.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T04-01-32.191917.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-58-28.496498.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-58-28.496498.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-58-28.496498.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T06-58-28.496498.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T04-01-32.191917.parquet"]}, 
{"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["**/details_harness|winogrande|5_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": ["**/details_harness|winogrande|5_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T06-58-28.496498.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T04_01_32.191917", "path": ["results_2024-01-11T04-01-32.191917.parquet"]}, {"split": "2024_01_11T06_58_28.496498", "path": 
["results_2024-01-11T06-58-28.496498.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T06-58-28.496498.parquet"]}]}]}
2024-01-11T07:01:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B Dataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T06:58:28.496498 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T06:58:28.496498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T06:58:28.496498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
361fa6cbd10af7c5f3d898a028d2374f4c3bd04e
## Datasets used for RLHF reinforcement learning with KULLM as the baseline. - **Step1: step1_SFT_train.jsonl** (used for supervised fine-tuning of the KULLM 12.8B model.) - **Step2: step2_RM_train.json** (used to train the polyglot-ko 1.3B model as a reward model.) - **Step3: step3_PPO_train.json** (used for RLHF training with the SFT model and the RM model.) For more details, please see: https://huggingface.co/Trofish/KULLM-RLHF ## Dataset construction for each reinforcement-learning stage ![image](https://github.com/VAIV-2023/VAIV2023/assets/79634774/a4988abd-c6fd-4fc2-8e53-9a02240e2275) ![image](https://github.com/VAIV-2023/VAIV2023/assets/79634774/dae49a1e-a834-463c-9f95-34cf254fdaeb) ## Considerations in dataset selection - **We built datasets to improve everyday-conversation ability and the handling of hate speech, and added general-task datasets to keep the chatbot model's performance on general tasks from degrading during training** - **National Institute of Korean Language everyday conversation dataset:** natural responses to everyday conversations, with correct spelling and no slang, ungrammatical sentences, or initial-consonant abbreviations, covering a variety of topics - **AI Hub hate speech dataset:** a variety of hate expressions across categories such as hate, discrimination, sexual content, violence, and crime - **General task datasets** - Evol-Instruct dataset: complex and logical prompts and answers across many domains - Self-Instruct dataset: data augmentation based on high-quality human-written seed data - RLHF Korean translation dataset: the dataset released by DeepSpeedChat, translated into Korean # Step1. SFT model fine-tuning ## Baseline Model [- **"KULLM"**, a Korean LLM developed by Korea University's NLP & AI Lab and the HIAI Institute](https://github.com/nlpai-lab/KULLM) ## Datasets ![image](https://github.com/VAIV-2023/VAIV2023/assets/79634774/085610db-3714-43c3-855b-58baad2f4e8b) # Step2. Reward Model ver1 implementation ## Baseline Model - **Polyglot-Ko**, a large-scale Korean language model developed by EleutherAI - Experimented with the 1.3b and 5.8b models separately ## Datasets ![image](https://github.com/VAIV-2023/RLHF-Korean-Friendly-LLM/assets/79634774/0082da9b-b0b8-4089-8647-cffa5ce724fb) - Followed InstructGPT's dataset-construction method - For the reward-model training data, we used the prompts from SFT training (1,500 prompts - everyday conversation:hate speech = 2:1) and new prompts (1,000 prompts - DeepSpeedChat translated dataset) - The SFT model generates K responses per prompt, and the responses are labeled with a ranking - Dataset labeling - InstructGPT labeled data with human annotators, but we used GPT-4 with G-Eval for consistent evaluation and to save time - Of the two responses generated by the SFT model, the one with the higher total G-Eval score is selected as the chosen response - The G-Eval evaluation prompt differed by dataset type - ![image](https://github.com/VAIV-2023/RLHF-Korean-Friendly-LLM/assets/79634774/7d7117d0-02e9-42dd-8ce3-5244cf726bf8) ## RLHF Model Evaluation ![image](https://github.com/VAIV-2023/VAIV2023/assets/79634774/2b58ed3a-7ed5-4e60-ba4b-c9b291b1fdff) ![image](https://github.com/VAIV-2023/VAIV2023/assets/79634774/75b2a1ee-d7c0-4ba9-ab2f-727abab644e9) ## Final RLHF Model - https://huggingface.co/Trofish/KULLM-RLHF
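## Loading the stage files As a minimal sketch, the three stage files can be loaded with the Hugging Face `datasets` JSON loader. This assumes the files sit in the repository root under the names listed above; the record schema inside each file is not documented in this card, so treat the snippet as an illustration rather than the authors' own loading code.

```python
from datasets import load_dataset

# Step1: SFT instruction data (JSON Lines)
sft_data = load_dataset("json", data_files="step1_SFT_train.jsonl", split="train")

# Step2: reward-model comparison data
rm_data = load_dataset("json", data_files="step2_RM_train.json", split="train")

# Step3: prompts for PPO-based RLHF training
ppo_data = load_dataset("json", data_files="step3_PPO_train.json", split="train")

# Inspect the column names of each stage before wiring them into a trainer.
print(sft_data, rm_data, ppo_data)
```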
Trofish/Korean-RLHF-Full-process
[ "task_categories:reinforcement-learning", "task_categories:text-generation", "language:ko", "license:cc-by-nc-4.0", "RLHF", "SFT", "RM", "instruction-tuning", "reward-model", "PPO", "region:us" ]
2024-01-11T05:00:38+00:00
{"language": ["ko"], "license": "cc-by-nc-4.0", "task_categories": ["reinforcement-learning", "text-generation"], "tags": ["RLHF", "SFT", "RM", "instruction-tuning", "reward-model", "PPO"]}
2024-01-11T05:17:07+00:00
[]
[ "ko" ]
TAGS #task_categories-reinforcement-learning #task_categories-text-generation #language-Korean #license-cc-by-nc-4.0 #RLHF #SFT #RM #instruction-tuning #reward-model #PPO #region-us
## Datasets used for RLHF reinforcement learning with KULLM as the baseline. - Step1: step1_SFT_train.jsonl (used for supervised fine-tuning of the KULLM 12.8B model.) - Step2: step2_RM_train.json (used to train the polyglot-ko 1.3B model as a reward model.) - Step3: step3_PPO_train.json (used for RLHF training with the SFT model and the RM model.) For more details, please see: URL ## Dataset construction for each reinforcement-learning stage !image !image ## Considerations in dataset selection - We built datasets to improve everyday-conversation ability and the handling of hate speech, and added general-task datasets to keep the chatbot model's performance on general tasks from degrading during training - National Institute of Korean Language everyday conversation dataset: natural responses to everyday conversations, with correct spelling and no slang, ungrammatical sentences, or initial-consonant abbreviations, covering a variety of topics - AI Hub hate speech dataset: a variety of hate expressions across categories such as hate, discrimination, sexual content, violence, and crime - General task datasets - Evol-Instruct dataset: complex and logical prompts and answers across many domains - Self-Instruct dataset: data augmentation based on high-quality human-written seed data - RLHF Korean translation dataset: the dataset released by DeepSpeedChat, translated into Korean # Step1. SFT model fine-tuning ## Baseline Model - "KULLM", a Korean LLM developed by Korea University's NLP & AI Lab and the HIAI Institute ## Datasets !image # Step2. Reward Model ver1 implementation ## Baseline Model - Polyglot-Ko, a large-scale Korean language model developed by EleutherAI - Experimented with the 1.3b and 5.8b models separately ## Datasets !image - Followed InstructGPT's dataset-construction method - For the reward-model training data, we used the prompts from SFT training (1,500 prompts - everyday conversation:hate speech = 2:1) and new prompts (1,000 prompts - DeepSpeedChat translated dataset) - The SFT model generates K responses per prompt, and the responses are labeled with a ranking - Dataset labeling - InstructGPT labeled data with human annotators, but we used GPT-4 with G-Eval for consistent evaluation and to save time - Of the two responses generated by the SFT model, the one with the higher total G-Eval score is selected as the chosen response - The G-Eval evaluation prompt differed by dataset type - !image ## RLHF Model Evaluation !image !image ## Final RLHF Model - URL
[ "## KULLM을 baseline으로 RLHF 강화학습을 하는데 사용한 데이터셋입니다.\n- Step1: step1_SFT_train.jsonl (KULLM 12.8B 모델을 Supervised Fine-Tuning 하는데 사용하였습니다.)\n- Step2: step2_RM_train.json (polyglot-ko 1.3B 모델을 Reward Model로 학습하는데 사용하였습니다.)\n- Step3: step3_PPO_train.json (SFT 모델과 RM 모델을 사용하여 RLHF 학습을 하는데 사용하였습니다.)\n자세한 정보는 다음을 참고해주세요: URL", "## 강화학습 단계별 데이터셋 구축\n!image\n!image", "## 데이터셋 선정 시 고려 사항\n- 일상 대화와 혐오 표현 대처 능력을 올리기 위한 데이터셋과, 학습 시 챗봇 모델의 general한 task에 대한 성능이 하락하는 것을 막기 위해서 general task 데이터셋을 구성\n \n- 국립국어원 일상 대화 데이터셋: 일상적인 대화에 대한 자연스러운 응답이 있으면서도, 맞춤법이 잘 지켜지고 은어, 비문, 초성 등이 없으며 주제별로 다양한 대화가 있음\n \n- AI Hub 혐오 표현 데이터셋: 혐오, 차별, 성적인 내용, 폭력, 범죄 등 카테고리별로 다양한 혐오 표현이 있음\n \n- General task 데이터셋\n - Evol-Instruct 데이터셋: 다양한 분야에 대한 복잡하고 논리적인 prompt와 답변이 있음\n - Self-Instruct 데이터셋: 사람이 직접 생성한 양질의 Seed data를 기반으로 데이터 증강\n - RLHF 한국어 번역 데이터셋: DeepSpeedChat에서 공개한 데이터셋을 한국어로 번역", "# Step1. SFT 모델 Fine-tuning", "## Baseline Model\n- 고려대학교 NLP & AI 연구실과 HIAI 연구소가 개발한 한국어 LLM \"KULLM\" 사용", "## Datasets\n!image", "# Step2. Reward Model ver1 구현", "## Baseline Model\n- EleutherAI에서 개발한 초거대 한국어 언어 모델 Polyglot-Ko 사용\n- 1.3b 모델과 5.8b 모델을 각각 실험", "## Datasets\n!image\n- InstructGPT의 데이터셋 구축 방법\n - Reward 모델 학습 데이터셋으로 SFT 학습에 사용한 prompt(1,500개 - 일상대화:혐오표현=2:1)와 새로운 prompt(1,000개 - DeepSpeedChat 번역 데이터셋) 사용 \n - SFT 모델에서 한개의 prompt당 K개의 Response를 생성하고, 순위를 Labeling\n- 데이터셋 라벨링\n - Instruct GPT의 경우 사람이 직접 Labeling을 하엿지만, 일관된 평가와 시간 단축을 위해 GPt-4와 G-Eval을 이용\n - SFT에서 생성한 두 Response 중 G-Eval 평가 점수 합이 높은 것을 Chosen response로 결정\n - 데이터셋 유형별로 G-Eval 평가 Prompt에 차이를 두었음\n - !image", "## RLFH Model Evaluation\n!image\n!image", "## Final RLHF Model\n- URL" ]
[ "TAGS\n#task_categories-reinforcement-learning #task_categories-text-generation #language-Korean #license-cc-by-nc-4.0 #RLHF #SFT #RM #instruction-tuning #reward-model #PPO #region-us \n", "## KULLM을 baseline으로 RLHF 강화학습을 하는데 사용한 데이터셋입니다.\n- Step1: step1_SFT_train.jsonl (KULLM 12.8B 모델을 Supervised Fine-Tuning 하는데 사용하였습니다.)\n- Step2: step2_RM_train.json (polyglot-ko 1.3B 모델을 Reward Model로 학습하는데 사용하였습니다.)\n- Step3: step3_PPO_train.json (SFT 모델과 RM 모델을 사용하여 RLHF 학습을 하는데 사용하였습니다.)\n자세한 정보는 다음을 참고해주세요: URL", "## 강화학습 단계별 데이터셋 구축\n!image\n!image", "## 데이터셋 선정 시 고려 사항\n- 일상 대화와 혐오 표현 대처 능력을 올리기 위한 데이터셋과, 학습 시 챗봇 모델의 general한 task에 대한 성능이 하락하는 것을 막기 위해서 general task 데이터셋을 구성\n \n- 국립국어원 일상 대화 데이터셋: 일상적인 대화에 대한 자연스러운 응답이 있으면서도, 맞춤법이 잘 지켜지고 은어, 비문, 초성 등이 없으며 주제별로 다양한 대화가 있음\n \n- AI Hub 혐오 표현 데이터셋: 혐오, 차별, 성적인 내용, 폭력, 범죄 등 카테고리별로 다양한 혐오 표현이 있음\n \n- General task 데이터셋\n - Evol-Instruct 데이터셋: 다양한 분야에 대한 복잡하고 논리적인 prompt와 답변이 있음\n - Self-Instruct 데이터셋: 사람이 직접 생성한 양질의 Seed data를 기반으로 데이터 증강\n - RLHF 한국어 번역 데이터셋: DeepSpeedChat에서 공개한 데이터셋을 한국어로 번역", "# Step1. SFT 모델 Fine-tuning", "## Baseline Model\n- 고려대학교 NLP & AI 연구실과 HIAI 연구소가 개발한 한국어 LLM \"KULLM\" 사용", "## Datasets\n!image", "# Step2. Reward Model ver1 구현", "## Baseline Model\n- EleutherAI에서 개발한 초거대 한국어 언어 모델 Polyglot-Ko 사용\n- 1.3b 모델과 5.8b 모델을 각각 실험", "## Datasets\n!image\n- InstructGPT의 데이터셋 구축 방법\n - Reward 모델 학습 데이터셋으로 SFT 학습에 사용한 prompt(1,500개 - 일상대화:혐오표현=2:1)와 새로운 prompt(1,000개 - DeepSpeedChat 번역 데이터셋) 사용 \n - SFT 모델에서 한개의 prompt당 K개의 Response를 생성하고, 순위를 Labeling\n- 데이터셋 라벨링\n - Instruct GPT의 경우 사람이 직접 Labeling을 하엿지만, 일관된 평가와 시간 단축을 위해 GPt-4와 G-Eval을 이용\n - SFT에서 생성한 두 Response 중 G-Eval 평가 점수 합이 높은 것을 Chosen response로 결정\n - 데이터셋 유형별로 G-Eval 평가 Prompt에 차이를 두었음\n - !image", "## RLFH Model Evaluation\n!image\n!image", "## Final RLHF Model\n- URL" ]
e4b38aa0d66e5e639c6876d09b61416199cb06b4
# Dataset Card for Evaluation run of vihangd/DopeyTinyLlama-1.1B-v1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [vihangd/DopeyTinyLlama-1.1B-v1](https://huggingface.co/vihangd/DopeyTinyLlama-1.1B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vihangd__DopeyTinyLlama-1.1B-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T05:11:02.109098](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__DopeyTinyLlama-1.1B-v1/blob/main/results_2024-01-11T05-11-02.109098.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2668796023902934, "acc_stderr": 0.030998189550288012, "acc_norm": 0.26612624532338797, "acc_norm_stderr": 0.0317752142871891, "mc1": 0.23623011015911874, "mc1_stderr": 0.014869755015871114, "mc2": 0.3736095253552331, "mc2_stderr": 0.014360223110249206 }, "harness|arc:challenge|25": { "acc": 0.36689419795221845, "acc_stderr": 0.014084133118104294, "acc_norm": 0.3839590443686007, "acc_norm_stderr": 0.014212444980651889 }, "harness|hellaswag|10": { "acc": 0.4779924317864967, "acc_stderr": 0.004984945635998307, "acc_norm": 0.6349332802230632, "acc_norm_stderr": 0.0048046491971637005 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2074074074074074, "acc_stderr": 0.03502553170678318, "acc_norm": 0.2074074074074074, "acc_norm_stderr": 0.03502553170678318 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.18421052631578946, "acc_stderr": 0.0315469804508223, "acc_norm": 0.18421052631578946, "acc_norm_stderr": 0.0315469804508223 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2792452830188679, "acc_stderr": 0.027611163402399715, "acc_norm": 0.2792452830188679, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.25, "acc_stderr": 0.03621034121889507, "acc_norm": 0.25, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2254335260115607, "acc_stderr": 0.03186209851641143, "acc_norm": 0.2254335260115607, "acc_norm_stderr": 0.03186209851641143 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.16666666666666666, "acc_stderr": 0.03708284662416542, "acc_norm": 0.16666666666666666, "acc_norm_stderr": 0.03708284662416542 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.251063829787234, "acc_stderr": 0.02834696377716245, "acc_norm": 0.251063829787234, "acc_norm_stderr": 0.02834696377716245 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.18421052631578946, "acc_stderr": 0.03646758875075566, "acc_norm": 0.18421052631578946, "acc_norm_stderr": 0.03646758875075566 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2, "acc_stderr": 0.03333333333333329, "acc_norm": 0.2, "acc_norm_stderr": 0.03333333333333329 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.21428571428571427, "acc_stderr": 0.03670066451047181, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.03670066451047181 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.24193548387096775, "acc_stderr": 0.024362599693031086, "acc_norm": 0.24193548387096775, "acc_norm_stderr": 0.024362599693031086 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.26108374384236455, "acc_stderr": 0.030903796952114492, "acc_norm": 0.26108374384236455, "acc_norm_stderr": 0.030903796952114492 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.22, "acc_stderr": 0.041633319989322695, "acc_norm": 0.22, "acc_norm_stderr": 0.041633319989322695 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.3090909090909091, "acc_stderr": 0.03608541011573967, "acc_norm": 0.3090909090909091, "acc_norm_stderr": 0.03608541011573967 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.25252525252525254, "acc_stderr": 0.030954055470365907, "acc_norm": 0.25252525252525254, "acc_norm_stderr": 0.030954055470365907 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.26424870466321243, "acc_stderr": 0.03182155050916648, "acc_norm": 0.26424870466321243, "acc_norm_stderr": 0.03182155050916648 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2717948717948718, "acc_stderr": 0.022556551010132354, "acc_norm": 0.2717948717948718, "acc_norm_stderr": 0.022556551010132354 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.0263357394040558, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.0263357394040558 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.23109243697478993, "acc_stderr": 0.027381406927868956, "acc_norm": 0.23109243697478993, "acc_norm_stderr": 0.027381406927868956 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.032578473844367746, "acc_norm": 
0.1986754966887417, "acc_norm_stderr": 0.032578473844367746 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.24220183486238533, "acc_stderr": 0.018368176306598618, "acc_norm": 0.24220183486238533, "acc_norm_stderr": 0.018368176306598618 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4351851851851852, "acc_stderr": 0.03381200005643525, "acc_norm": 0.4351851851851852, "acc_norm_stderr": 0.03381200005643525 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.03019028245350194, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.03019028245350194 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2489451476793249, "acc_stderr": 0.028146970599422644, "acc_norm": 0.2489451476793249, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3632286995515695, "acc_stderr": 0.03227790442850499, "acc_norm": 0.3632286995515695, "acc_norm_stderr": 0.03227790442850499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.24074074074074073, "acc_stderr": 0.04133119440243838, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.04133119440243838 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.26380368098159507, "acc_stderr": 0.03462419931615623, "acc_norm": 0.26380368098159507, "acc_norm_stderr": 0.03462419931615623 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340455, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340455 }, "harness|hendrycksTest-management|5": { "acc": 0.2621359223300971, "acc_stderr": 0.04354631077260597, "acc_norm": 0.2621359223300971, "acc_norm_stderr": 0.04354631077260597 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2863247863247863, "acc_stderr": 0.02961432369045665, "acc_norm": 0.2863247863247863, "acc_norm_stderr": 0.02961432369045665 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2681992337164751, "acc_stderr": 0.015842430835269435, "acc_norm": 0.2681992337164751, "acc_norm_stderr": 0.015842430835269435 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.23410404624277456, "acc_stderr": 0.02279711027807114, "acc_norm": 0.23410404624277456, "acc_norm_stderr": 0.02279711027807114 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24804469273743016, "acc_stderr": 0.014444157808261427, "acc_norm": 0.24804469273743016, "acc_norm_stderr": 0.014444157808261427 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.23529411764705882, "acc_stderr": 0.02428861946604611, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.02428861946604611 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.31511254019292606, "acc_stderr": 0.026385273703464482, "acc_norm": 0.31511254019292606, "acc_norm_stderr": 0.026385273703464482 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2623456790123457, "acc_stderr": 0.02447722285613511, "acc_norm": 0.2623456790123457, "acc_norm_stderr": 0.02447722285613511 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.2730496453900709, "acc_stderr": 0.026577860943307847, "acc_norm": 0.2730496453900709, "acc_norm_stderr": 0.026577860943307847 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2379400260756193, "acc_stderr": 0.01087570078769424, "acc_norm": 0.2379400260756193, "acc_norm_stderr": 0.01087570078769424 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.2647058823529412, "acc_stderr": 0.026799562024887674, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.026799562024887674 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25163398692810457, "acc_stderr": 0.01755581809132227, "acc_norm": 0.25163398692810457, "acc_norm_stderr": 0.01755581809132227 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.3, "acc_stderr": 0.04389311454644286, "acc_norm": 0.3, "acc_norm_stderr": 0.04389311454644286 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.1673469387755102, "acc_stderr": 0.023897144768914524, "acc_norm": 0.1673469387755102, "acc_norm_stderr": 0.023897144768914524 }, "harness|hendrycksTest-sociology|5": { "acc": 0.23880597014925373, "acc_stderr": 0.03014777593540922, "acc_norm": 0.23880597014925373, "acc_norm_stderr": 0.03014777593540922 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-virology|5": { "acc": 0.3192771084337349, "acc_stderr": 0.03629335329947861, "acc_norm": 0.3192771084337349, "acc_norm_stderr": 0.03629335329947861 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21637426900584794, "acc_stderr": 0.031581495393387345, "acc_norm": 0.21637426900584794, "acc_norm_stderr": 0.031581495393387345 }, "harness|truthfulqa:mc|0": { "mc1": 0.23623011015911874, "mc1_stderr": 0.014869755015871114, "mc2": 0.3736095253552331, "mc2_stderr": 0.014360223110249206 }, "harness|winogrande|5": { "acc": 0.734017363851618, "acc_stderr": 0.012418323153051043 }, "harness|gsm8k|5": { "acc": 0.01819560272934041, "acc_stderr": 0.003681611894073874 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
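## Loading task-level details As a further minimal sketch, individual task configs can be loaded the same way as the example near the top of this card. The config name `harness_gsm8k_5` and the `latest` split are taken from this repository's metadata, so this assumes that layout is current:

```python
from datasets import load_dataset

# Pull the most recent GSM8K details for this model; per the split layout
# described above, "latest" always points to the newest evaluation run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_vihangd__DopeyTinyLlama-1.1B-v1",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```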
open-llm-leaderboard/details_vihangd__DopeyTinyLlama-1.1B-v1
[ "region:us" ]
2024-01-11T05:12:47+00:00
{"pretty_name": "Evaluation run of vihangd/DopeyTinyLlama-1.1B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [vihangd/DopeyTinyLlama-1.1B-v1](https://huggingface.co/vihangd/DopeyTinyLlama-1.1B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__DopeyTinyLlama-1.1B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T05:11:02.109098](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__DopeyTinyLlama-1.1B-v1/blob/main/results_2024-01-11T05-11-02.109098.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2668796023902934,\n \"acc_stderr\": 0.030998189550288012,\n \"acc_norm\": 0.26612624532338797,\n \"acc_norm_stderr\": 0.0317752142871891,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3736095253552331,\n \"mc2_stderr\": 0.014360223110249206\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.36689419795221845,\n \"acc_stderr\": 0.014084133118104294,\n \"acc_norm\": 0.3839590443686007,\n \"acc_norm_stderr\": 0.014212444980651889\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4779924317864967,\n \"acc_stderr\": 0.004984945635998307,\n \"acc_norm\": 0.6349332802230632,\n \"acc_norm_stderr\": 0.0048046491971637005\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 
0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416542,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416542\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.02834696377716245,\n \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.02834696377716245\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.03646758875075566,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.03646758875075566\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03333333333333329,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03333333333333329\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n \"acc_stderr\": 0.024362599693031086,\n \"acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.024362599693031086\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114492,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114492\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.03608541011573967,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.03608541011573967\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365907,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365907\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916648,\n \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916648\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.2717948717948718,\n \"acc_stderr\": 0.022556551010132354,\n \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.022556551010132354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868956,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868956\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.032578473844367746,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.032578473844367746\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24220183486238533,\n \"acc_stderr\": 0.018368176306598618,\n \"acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.018368176306598618\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.3632286995515695,\n \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2681992337164751,\n \"acc_stderr\": 0.015842430835269435,\n 
\"acc_norm\": 0.2681992337164751,\n \"acc_norm_stderr\": 0.015842430835269435\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.02279711027807114,\n \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.02279711027807114\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.014444157808261427,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.014444157808261427\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02428861946604611,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02428861946604611\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31511254019292606,\n \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.31511254019292606,\n \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307847,\n \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307847\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2379400260756193,\n \"acc_stderr\": 0.01087570078769424,\n \"acc_norm\": 0.2379400260756193,\n \"acc_norm_stderr\": 0.01087570078769424\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.026799562024887674,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.026799562024887674\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132227,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.023897144768914524,\n \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.023897144768914524\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.031581495393387345,\n \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.031581495393387345\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3736095253552331,\n \"mc2_stderr\": 0.014360223110249206\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.734017363851618,\n \"acc_stderr\": 0.012418323153051043\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \"acc_stderr\": 0.003681611894073874\n }\n}\n```", "repo_url": 
"https://huggingface.co/vihangd/DopeyTinyLlama-1.1B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|arc:challenge|25_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|gsm8k|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hellaswag|10_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T05-11-02.109098.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T05-11-02.109098.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T05-11-02.109098.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T05-11-02.109098.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T05-11-02.109098.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T05_11_02.109098", "path": ["**/details_harness|winogrande|5_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T05-11-02.109098.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T05_11_02.109098", "path": ["results_2024-01-11T05-11-02.109098.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T05-11-02.109098.parquet"]}]}]}
2024-01-11T05:13:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vihangd/DopeyTinyLlama-1.1B-v1 Dataset automatically created during the evaluation run of model vihangd/DopeyTinyLlama-1.1B-v1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2024-01-11T05:11:02.109098 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
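A minimal sketch of that loading step, assuming the usual Open LLM Leaderboard convention for details repos (the repo id below is inferred from the model name and the pattern used by sibling cards in this dump, so verify it before relying on it):

```python
from datasets import load_dataset

# Repo id assumed to follow open-llm-leaderboard/details_<org>__<model>;
# "harness_winogrande_5" is one of the config names listed in this card.
data = load_dataset(
    "open-llm-leaderboard/details_vihangd__DopeyTinyLlama-1.1B-v1",
    "harness_winogrande_5",
    split="train",
)
```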
[ "# Dataset Card for Evaluation run of vihangd/DopeyTinyLlama-1.1B-v1\n\n\n\nDataset automatically created during the evaluation run of model vihangd/DopeyTinyLlama-1.1B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T05:11:02.109098(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vihangd/DopeyTinyLlama-1.1B-v1\n\n\n\nDataset automatically created during the evaluation run of model vihangd/DopeyTinyLlama-1.1B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T05:11:02.109098(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5a6b5e4421c384dd3613d1c6553518e6d63e6eff
# Metadata Database for Danbooru2023
Danbooru 2023 datasets: https://huggingface.co/datasets/nyanko7/danbooru2023

This dataset contains an SQLite db file which has all the tags and posts metadata in it.<br>
The Peewee ORM config file is provided too; please check it for more information. (Especially on how I link posts and tags together.)

The original data is from the official dump of the posts info.<br>
Check this [link](https://console.cloud.google.com/storage/browser/danbooru_public/data) for more info.

## Details
This section contains some details that you need to be aware of if you want to use another ORM system or plain SQL queries to utilize this database.

#### Custom Enum Fields
Some fields in Post/Tags use my custom enum field to store the type/category:

* Post.rating
  * 0: general
  * 1: sensitive
  * 2: questionable
  * 3: explicit
* Tag.type
  * 0: general
  * 1: artist
  * 2: character
  * 3: copyright
  * 4: meta

#### Tag List
I use the Peewee ManyToManyField to implement the tag list, which utilizes a through model holding all the pairs of Tag and Post.<br>
Since it is very likely we will want to use Tag to query posts, many-to-many is the better fit.<br>
The con of this design is that the database file is 1.5x larger than before (we have 0.25B entries for the post-tag pairs), but the query speed becomes 2~3x faster, so I think it is acceptable.

After doing some checks, I can confirm that every "categorical tag list" can be produced from the full list plus a filter, and that is how it is done now. Check the db.py for more details.

#### Utils
If you think the above details are too complicated, just use the db_utils.py and the other Peewee APIs to utilize this database.
I also provide a write_csv.py for exporting the whole dataset into CSV for data analysis.

## License
The source code and database file of this repo are licensed under the MIT License.<br>
**Notice**: The license doesn't cover the "content" of the database.<br>
All the content is from official danbooru dumps for posts' meta.

## Acknowledgement
Thanks to AngelBottomless for fixing wrong entries and adding more entries to this dataset:<br>
https://huggingface.co/datasets/AngelBottomless/danbooru-2023-sqlite-fixed-7110548

Note: I have changed the definition of TagListField and have added some indexes to it. Do not mix up the .db files from the 2 different repos.
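To make the ORM design above concrete, here is a minimal usage sketch. It assumes the db.py in this repo exposes a SqliteDatabase handle named `db` plus `Post` and `Tag` models with a `Tag.name` field and the many-to-many backref `tag.posts`; these names are assumptions, so verify them against the actual file:

```python
# Minimal sketch, assuming db.py defines: db (SqliteDatabase), Post, Tag,
# and the Peewee many-to-many backref tag.posts described above.
from db import db, Post, Tag

db.init("danbooru2023.db")  # assumed filename of the downloaded db file
db.connect()

# Look up a tag by name, then filter its posts by rating
# (0: general, per the custom enum documented above).
tag = Tag.get(Tag.name == "1girl")
general_posts = tag.posts.where(Post.rating == 0)
print(general_posts.count())

db.close()
```

Because the through model stores every Tag/Post pair, this query resolves through a single join instead of scanning a serialized tag list, which is where the 2~3x speedup described above comes from.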
KBlueLeaf/danbooru2023-sqlite
[ "task_categories:image-classification", "task_categories:text-to-image", "language:en", "license:mit", "region:us" ]
2024-01-11T05:13:15+00:00
{"language": ["en"], "license": "mit", "task_categories": ["image-classification", "text-to-image"]}
2024-02-02T13:23:10+00:00
[]
[ "en" ]
TAGS #task_categories-image-classification #task_categories-text-to-image #language-English #license-mit #region-us
# Metadata Database for Danbooru2023 Danbooru 2023 datasets: URL This dataset contains an SQLite db file which has all the tags and posts metadata in it.<br> The Peewee ORM config file is provided too; please check it for more information. (Especially on how I link posts and tags together.) The original data is from the official dump of the posts info.<br> Check this link for more info. ## Details This section contains some details that you need to be aware of if you want to use another ORM system or plain SQL queries to utilize this database. #### Custom Enum Fields Some fields in Post/Tags use my custom enum field to store the type/category: * URL * 0: general * 1: sensitive * 2: questionable * 3: explicit * URL * 0: general * 1: artist * 2: character * 3: copyright * 4: meta #### Tag List I use the Peewee ManyToManyField to implement the tag list, which utilizes a through model holding all the pairs of Tag and Post.<br> Since it is very likely we will want to use Tag to query posts, many-to-many is the better fit.<br> The con of this design is that the database file is 1.5x larger than before (we have 0.25B entries for the post-tag pairs), but the query speed becomes 2~3x faster, so I think it is acceptable. After doing some checks, I can confirm that every "categorical tag list" can be produced from the full list plus a filter, and that is how it is done now. Check the URL for more details. #### Utils If you think the above details are too complicated, just use the db_utils.py and the other Peewee APIs to utilize this database. I also provide a write_csv.py for exporting the whole dataset into CSV for data analysis. ## License The source code and database file of this repo are licensed under the MIT License.<br> Notice: The license doesn't cover the "content" of the database.<br> All the content is from official danbooru dumps for posts' meta. ## Acknowledgement Thanks to AngelBottomless for fixing wrong entries and adding more entries to this dataset:<br> URL Note: I have changed the definition of TagListField and have added some indexes to it. Do not mix up the .db files from the 2 different repos.
[ "# Metadata Database for Danbooru2023\nDanbooru 2023 datasets: URL\n\nThis dataset contains a sqlite db file which have all the tags and posts metadata in it.<br>\nThe Peewee ORM config file is provided too, plz check it for more information. (Especially on how I link posts and tags together)\n\nThe original data is from the official dump of the posts info.<br>\nCheck this link for more info.", "## Details\nThis section contains some details that you need to be aware of if you want to use other ORM system or use plain SQL query to utilize this database.", "#### Custom Enum Fields\nSome fields in Post/Tags use my custom enum field to store type/category or something like that:\n\n* URL\n * 0: general\n * 1: sensitive\n * 2: questionable\n * 3: explicit\n* URL\n * 0: general\n * 1: artist\n * 2: character\n * 3: copyright\n * 4: meta", "#### Tag List\nI use peewee ManyToManyField to implement the Tag List things. Which utilize a through model which have all the pair of Tag and Post<br>\nSince it is very likely we will want to use Tag to query posts, so many-to-many is better.<br>\nThe con of this design is the database file will be 1.5x larger than before(we have 0.25B entries for the post-tag pairs), but the query speed become 2~3x faster, so I think it is acceptable.\n\nAfter done some checking, I can ensure that all the \"categorical tag list\" can be done by full list + filter, and that is how I done it now. Check the URL for more details.", "#### Utils\nif you think above details are too complicated, just use the db_utils.py and other PeeWee API to utilize this database.\nI also provide a write_csv.py for exporting whole dataset into csv for data analysis.", "## License\nThe source code, database file of this repo is licensed under MiT License.<br>\nNotice: The license doesn't cover the \"content\" of the database.<br>\nAll the content is from official danbooru dumps for posts' meta.", "## Acknowledgement\nThx for AngelBottomless for fixing wrong entries and add more entries into this dataset:<br>\nURL\n\nNote: I have changed the definition of TagListField and have added some index into it. Do not mixed up the .db files from 2 different repo." ]
[ "TAGS\n#task_categories-image-classification #task_categories-text-to-image #language-English #license-mit #region-us \n", "# Metadata Database for Danbooru2023\nDanbooru 2023 datasets: URL\n\nThis dataset contains a sqlite db file which have all the tags and posts metadata in it.<br>\nThe Peewee ORM config file is provided too, plz check it for more information. (Especially on how I link posts and tags together)\n\nThe original data is from the official dump of the posts info.<br>\nCheck this link for more info.", "## Details\nThis section contains some details that you need to be aware of if you want to use other ORM system or use plain SQL query to utilize this database.", "#### Custom Enum Fields\nSome fields in Post/Tags use my custom enum field to store type/category or something like that:\n\n* URL\n * 0: general\n * 1: sensitive\n * 2: questionable\n * 3: explicit\n* URL\n * 0: general\n * 1: artist\n * 2: character\n * 3: copyright\n * 4: meta", "#### Tag List\nI use peewee ManyToManyField to implement the Tag List things. Which utilize a through model which have all the pair of Tag and Post<br>\nSince it is very likely we will want to use Tag to query posts, so many-to-many is better.<br>\nThe con of this design is the database file will be 1.5x larger than before(we have 0.25B entries for the post-tag pairs), but the query speed become 2~3x faster, so I think it is acceptable.\n\nAfter done some checking, I can ensure that all the \"categorical tag list\" can be done by full list + filter, and that is how I done it now. Check the URL for more details.", "#### Utils\nif you think above details are too complicated, just use the db_utils.py and other PeeWee API to utilize this database.\nI also provide a write_csv.py for exporting whole dataset into csv for data analysis.", "## License\nThe source code, database file of this repo is licensed under MiT License.<br>\nNotice: The license doesn't cover the \"content\" of the database.<br>\nAll the content is from official danbooru dumps for posts' meta.", "## Acknowledgement\nThx for AngelBottomless for fixing wrong entries and add more entries into this dataset:<br>\nURL\n\nNote: I have changed the definition of TagListField and have added some index into it. Do not mixed up the .db files from 2 different repo." ]
490b89f9b1654966de6273da3acee9daa2617890
# Fine-tuned XLSR-53 large model for speech recognition in Japanese

Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Japanese using the train and validation splits of [Common Voice 6.1](https://huggingface.co/datasets/common_voice), [CSS10](https://github.com/Kyubyong/css10) and [JSUT](https://sites.google.com/site/shinnosuketakamichi/publication/jsut).
When using this model, make sure that your speech input is sampled at 16kHz.

This model has been fine-tuned thanks to the GPU credits generously given by the [OVHcloud](https://www.ovhcloud.com/en/public-cloud/ai-training/) :)

The script used for training can be found here: https://github.com/jonatasgrosman/wav2vec2-sprint

## Usage

The model can be used directly (without a language model) as follows...

Using the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) library:

```python
from huggingsound import SpeechRecognitionModel

model = SpeechRecognitionModel("jonatasgrosman/wav2vec2-large-xlsr-53-japanese")
audio_paths = ["/path/to/file.mp3", "/path/to/another_file.wav"]

transcriptions = model.transcribe(audio_paths)
```

Writing your own inference script:

```python
import torch
import librosa
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

LANG_ID = "ja"
MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-japanese"
SAMPLES = 10

test_dataset = load_dataset("common_voice", LANG_ID, split=f"test[:{SAMPLES}]")

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)

# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000)
    batch["speech"] = speech_array
    batch["sentence"] = batch["sentence"].upper()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits

predicted_ids = torch.argmax(logits, dim=-1)
predicted_sentences = processor.batch_decode(predicted_ids)

for i, predicted_sentence in enumerate(predicted_sentences):
    print("-" * 100)
    print("Reference:", test_dataset[i]["sentence"])
    print("Prediction:", predicted_sentence)
```

| Reference | Prediction |
| ------------- | ------------- |
| 祖母は、おおむね機嫌よく、サイコロをころがしている。 | 人母は重にきね起くさいがしている |
| 財布をなくしたので、交番へ行きます。 | 財布をなく手端ので勾番へ行きます |
| 飲み屋のおやじ、旅館の主人、医者をはじめ、交際のある人にきいてまわったら、みんな、私より収入が多いはずなのに、税金は安い。 | ノ宮屋のお親じ旅館の主に医者をはじめ交際のアル人トに聞いて回ったらみんな私より収入が多いはなうに税金は安い |
| 新しい靴をはいて出かけます。 | だらしい靴をはいて出かけます |
| このためプラズマ中のイオンや電子の持つ平均運動エネルギーを温度で表現することがある | このためプラズマ中のイオンや電子の持つ平均運動エネルギーを温度で表弁することがある |
| 松井さんはサッカーより野球のほうが上手です。 | 松井さんはサッカーより野球のほうが上手です |
| 新しいお皿を使います。 | 新しいお皿を使います |
| 結婚以来三年半ぶりの東京も、旧友とのお酒も、夜行列車も、駅で寝て、朝を待つのも久しぶりだ。 | 結婚ル二来三年半降りの東京も吸とのお酒も野越者も駅で寝て朝を待つの久しぶりた |
| これまで、少年野球、ママさんバレーなど、地域スポーツを支え、市民に密着してきたのは、無数のボランティアだった。 | これまで少年野球<unk>三バレーなど地域スポーツを支え市民に満着してきたのは娘数のボランティアだった |
| 靴を脱いで、スリッパをはきます。 | 靴を脱いでスイパーをはきます |

## Evaluation

The model can be evaluated as follows on the Japanese test data of Common Voice.
```python
import torch
import re
import warnings
import librosa
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

LANG_ID = "ja"
MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-japanese"
DEVICE = "cuda"

CHARS_TO_IGNORE = [",", "?", "¿", ".", "!", "¡", ";", ";", ":", '""', "%", '"', "�", "ʿ", "·", "჻", "~", "՞", "؟", "،", "।", "॥", "«", "»", "„", "“", "”", "「", "」", "‘", "’", "《", "》", "(", ")", "[", "]", "{", "}", "=", "`", "_", "+", "<", ">", "…", "–", "°", "´", "ʾ", "‹", "›", "©", "®", "—", "→", "。", "、", "﹂", "﹁", "‧", "~", "﹏", ",", "{", "}", "(", ")", "[", "]", "【", "】", "‥", "〽", "『", "』", "〝", "〟", "⟨", "⟩", "〜", ":", "!", "?", "♪", "؛", "/", "\\", "º", "−", "^", "'", "ʻ", "ˆ"]

test_dataset = load_dataset("common_voice", LANG_ID, split="test")

wer = load_metric("wer.py")  # https://github.com/jonatasgrosman/wav2vec2-sprint/blob/main/wer.py
cer = load_metric("cer.py")  # https://github.com/jonatasgrosman/wav2vec2-sprint/blob/main/cer.py

chars_to_ignore_regex = f"[{re.escape(''.join(CHARS_TO_IGNORE))}]"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.to(DEVICE)

# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    with warnings.catch_warnings():
        warnings.simplefilter("ignore")
        speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000)
    batch["speech"] = speech_array
    batch["sentence"] = re.sub(chars_to_ignore_regex, "", batch["sentence"]).upper()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)

# Run batched inference on the preprocessed audio arrays
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to(DEVICE), attention_mask=inputs.attention_mask.to(DEVICE)).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch

result = test_dataset.map(evaluate, batched=True, batch_size=8)

predictions = [x.upper() for x in result["pred_strings"]]
references = [x.upper() for x in result["sentence"]]

print(f"WER: {wer.compute(predictions=predictions, references=references, chunk_size=1000) * 100}")
print(f"CER: {cer.compute(predictions=predictions, references=references, chunk_size=1000) * 100}")
```

**Test Result**:

In the table below I report the Word Error Rate (WER) and the Character Error Rate (CER) of the model. I ran the evaluation script described above on other models as well (on 2021-05-10). Note that the table below may show results that differ from those already reported; this may be due to specifics of the other evaluation scripts used.

| Model | WER | CER |
| ------------- | ------------- | ------------- |
| jonatasgrosman/wav2vec2-large-xlsr-53-japanese | **81.80%** | **20.16%** |
| vumichien/wav2vec2-large-xlsr-japanese | 1108.86% | 23.40% |
| qqhann/w2v_hf_jsut_xlsr53 | 1012.18% | 70.77% |

## Citation
If you want to cite this model you can use this:

```bibtex
@misc{grosman2021xlsr53-large-japanese,
  title={Fine-tuned {XLSR}-53 large model for speech recognition in {J}apanese},
  author={Grosman, Jonatas},
  howpublished={\url{https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-japanese}},
  year={2021}
}
```
Gustav114514/work
[ "language:ja", "license:apache-2.0", "audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week", "region:us" ]
2024-01-11T05:32:50+00:00
{"language": "ja", "license": "apache-2.0", "datasets": ["common_voice"], "metrics": ["wer", "cer"], "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "model-index": [{"name": "XLSR Wav2Vec2 Japanese by Jonatas Grosman", "results": [{"task": {"name": "Speech Recognition", "type": "automatic-speech-recognition"}, "dataset": {"name": "Common Voice ja", "type": "common_voice", "args": "ja"}, "metrics": [{"name": "Test WER", "type": "wer", "value": 81.8}, {"name": "Test CER", "type": "cer", "value": 20.16}]}]}]}
2024-01-11T05:36:43+00:00
[]
[ "ja" ]
TAGS #language-Japanese #license-apache-2.0 #audio #automatic-speech-recognition #speech #xlsr-fine-tuning-week #region-us
Fine-tuned XLSR-53 large model for speech recognition in Japanese ================================================================= Fine-tuned facebook/wav2vec2-large-xlsr-53 on Japanese using the train and validation splits of Common Voice 6.1, CSS10 and JSUT. When using this model, make sure that your speech input is sampled at 16kHz. This model has been fine-tuned thanks to the GPU credits generously given by the OVHcloud :) The script used for training can be found here: URL Usage ----- The model can be used directly (without a language model) as follows... Using the HuggingSound library: Writing your own inference script: Evaluation ---------- The model can be evaluated as follows on the Japanese test data of Common Voice. Test Result: In the table below I report the Word Error Rate (WER) and the Character Error Rate (CER) of the model. I ran the evaluation script described above on other models as well (on 2021-05-10). Note that the table below may show results that differ from those already reported; this may be due to specifics of the other evaluation scripts used. Model: jonatasgrosman/wav2vec2-large-xlsr-53-japanese, WER: 81.80%, CER: 20.16% Model: vumichien/wav2vec2-large-xlsr-japanese, WER: 1108.86%, CER: 23.40% Model: qqhann/w2v\_hf\_jsut\_xlsr53, WER: 1012.18%, CER: 70.77% If you want to cite this model you can use this:
[]
[ "TAGS\n#language-Japanese #license-apache-2.0 #audio #automatic-speech-recognition #speech #xlsr-fine-tuning-week #region-us \n" ]
45c670a3e879c1a0f47a9e0456d92b0088e1d7d8
# Dataset Card for "cauhoiphapluat_400tokenanswer" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
phamtungthuy/cauhoiphapluat_400tokenanswer
[ "region:us" ]
2024-01-11T05:43:22+00:00
{"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "field", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 16155075, "num_examples": 8808}, {"name": "test", "num_bytes": 32477322, "num_examples": 17616}, {"name": "train", "num_bytes": 113686598, "num_examples": 61684}], "download_size": 59968004, "dataset_size": 162318995}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}]}]}
2024-01-11T05:44:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cauhoiphapluat_400tokenanswer" More Information needed
[ "# Dataset Card for \"cauhoiphapluat_400tokenanswer\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cauhoiphapluat_400tokenanswer\"\n\nMore Information needed" ]
b04f84fc2a990ffb5e229e7cf67661ebd2014ca7
# Dataset Card for "JP_Morgan_Que_A" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
SayaliB/JP_Morgan_Que_A
[ "region:us" ]
2024-01-11T06:10:03+00:00
{"dataset_info": {"features": [{"name": "Question", "dtype": "string"}, {"name": "Answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 77111, "num_examples": 321}], "download_size": 36979, "dataset_size": 77111}}
2024-01-11T06:10:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for "JP_Morgan_Que_A" More Information needed
[ "# Dataset Card for \"JP_Morgan_Que_A\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"JP_Morgan_Que_A\"\n\nMore Information needed" ]
771ec252531a5c754c2037aaa3561f8489dcb0f7
This is an unmaintained copy of [`mlabonne/chatml_dpo_pairs`](https://huggingface.co/datasets/mlabonne/chatml_dpo_pairs) that we use in the TRL CI for testing purposes. Please refer to the original dataset for usage and more details.
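For completeness, a minimal loading sketch; the `prompt`/`chosen`/`rejected` columns and the single train split come from the repository metadata below:

```python
from datasets import load_dataset

# Load the pinned copy used by the TRL CI; only a train split exists.
ds = load_dataset("trl-internal-testing/mlabonne-chatml-dpo-pairs-copy", split="train")

# Each row is a preference pair: a prompt plus a chosen and a rejected completion.
example = ds[0]
print(example["prompt"][:200])
print(example["chosen"][:200])
```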
trl-internal-testing/mlabonne-chatml-dpo-pairs-copy
[ "region:us" ]
2024-01-11T06:16:12+00:00
{"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 35914686, "num_examples": 12859}], "download_size": 19539812, "dataset_size": 35914686}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-11T06:17:53+00:00
[]
[]
TAGS #region-us
This is an unmaintained copy of 'mlabonne/chatml_dpo_pairs' that we use in the TRL CI for testing purposes. Please refer to the original dataset for usage and more details.
[]
[ "TAGS\n#region-us \n" ]
bb809237a01384d3024066bed053d09131239c76
# Dataset Card for Evaluation run of shitshow123/stablelm_sft_dpo <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [shitshow123/stablelm_sft_dpo](https://huggingface.co/shitshow123/stablelm_sft_dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); see the loading sketch at the end of this card. To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_shitshow123__stablelm_sft_dpo", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T06:15:16.371728](https://huggingface.co/datasets/open-llm-leaderboard/details_shitshow123__stablelm_sft_dpo/blob/main/results_2024-01-11T06-15-16.371728.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.23196194129343728, "acc_stderr": 0.029934654752561563, "acc_norm": 0.2314240573187148, "acc_norm_stderr": 0.03071122006512167, "mc1": 1.0, "mc1_stderr": 0.0, "mc2": NaN, "mc2_stderr": NaN }, "harness|arc:challenge|25": { "acc": 0.22696245733788395, "acc_stderr": 0.012240491536132861, "acc_norm": 0.22696245733788395, "acc_norm_stderr": 0.012240491536132861 }, "harness|hellaswag|10": { "acc": 0.2504481179047998, "acc_stderr": 0.004323856300539177, "acc_norm": 0.2504481179047998, "acc_norm_stderr": 0.004323856300539177 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 
0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, 
"acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 1.0, "mc1_stderr": 0.0, "mc2": NaN, "mc2_stderr": NaN }, "harness|winogrande|5": { "acc": 0.4956590370955012, "acc_stderr": 0.014051956064076911 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_shitshow123__stablelm_sft_dpo
[ "region:us" ]
2024-01-11T06:17:20+00:00
{"pretty_name": "Evaluation run of shitshow123/stablelm_sft_dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [shitshow123/stablelm_sft_dpo](https://huggingface.co/shitshow123/stablelm_sft_dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shitshow123__stablelm_sft_dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T06:15:16.371728](https://huggingface.co/datasets/open-llm-leaderboard/details_shitshow123__stablelm_sft_dpo/blob/main/results_2024-01-11T06-15-16.371728.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n 
\"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n 
\"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/shitshow123/stablelm_sft_dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|arc:challenge|25_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|gsm8k|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hellaswag|10_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-15-16.371728.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-15-16.371728.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-15-16.371728.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T06-15-16.371728.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-15-16.371728.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["**/details_harness|winogrande|5_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T06-15-16.371728.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T06_15_16.371728", "path": ["results_2024-01-11T06-15-16.371728.parquet"]}, {"split": "latest", "path": 
["results_2024-01-11T06-15-16.371728.parquet"]}]}]}
2024-01-11T06:17:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of shitshow123/stablelm_sft_dpo Dataset automatically created during the evaluation run of model shitshow123/stablelm_sft_dpo on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T06:15:16.371728 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of shitshow123/stablelm_sft_dpo\n\n\n\nDataset automatically created during the evaluation run of model shitshow123/stablelm_sft_dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T06:15:16.371728(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of shitshow123/stablelm_sft_dpo\n\n\n\nDataset automatically created during the evaluation run of model shitshow123/stablelm_sft_dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T06:15:16.371728(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
38695ea3a96c86f2f79bf62f5d38738ff3a54e33
# Dataset Card for Evaluation run of senseable/garten2-7b

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [senseable/garten2-7b](https://huggingface.co/senseable/garten2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_senseable__garten2-7b",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-01-11T06:51:24.778397](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__garten2-7b/blob/main/results_2024-01-11T06-51-24.778397.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.6592274978159288, "acc_stderr": 0.03186938315275516, "acc_norm": 0.6587692648732468, "acc_norm_stderr": 0.03253666311812878, "mc1": 0.4320685434516524, "mc1_stderr": 0.017341202394988257, "mc2": 0.5949809262920956, "mc2_stderr": 0.015506243501904441 },
    "harness|arc:challenge|25": { "acc": 0.6663822525597269, "acc_stderr": 0.013778687054176536, "acc_norm": 0.6936860068259386, "acc_norm_stderr": 0.013470584417276513 },
    "harness|hellaswag|10": { "acc": 0.7076279625572595, "acc_stderr": 0.004539227260397024, "acc_norm": 0.8754232224656443, "acc_norm_stderr": 0.003295634907666466 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337124, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337124 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.0356760379963917, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.0356760379963917 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.02522545028406788, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.02522545028406788 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.5317460317460317, "acc_stderr": 0.04463112720677173, "acc_norm": 0.5317460317460317, "acc_norm_stderr": 0.04463112720677173 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723295, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723295 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.031922715695483, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.031922715695483 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586808, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586808 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131147, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131147 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977934, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977934 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.038227469376587525, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.038227469376587525 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8568807339449541, "acc_stderr": 0.015014462497168589, "acc_norm": 0.8568807339449541, "acc_norm_stderr": 0.015014462497168589 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5555555555555556, "acc_stderr": 0.03388857118502325, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.03388857118502325 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250447, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250447 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699813, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699813 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098822, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098822 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.04738975119274155, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.04738975119274155 },
    "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371803, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371803 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7572254335260116, "acc_stderr": 0.023083658586984204, "acc_norm": 0.7572254335260116, "acc_norm_stderr": 0.023083658586984204 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.46368715083798884, "acc_stderr": 0.016678341894533166, "acc_norm": 0.46368715083798884, "acc_norm_stderr": 0.016678341894533166 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.025058503316958143, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.025058503316958143 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818763, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818763 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.029719281272236848, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.029719281272236848 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.47522816166883963, "acc_stderr": 0.012754553719781753, "acc_norm": 0.47522816166883963, "acc_norm_stderr": 0.012754553719781753 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.027971541370170598, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.027971541370170598 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6879084967320261, "acc_stderr": 0.018745011201277657, "acc_norm": 0.6879084967320261, "acc_norm_stderr": 0.018745011201277657 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 },
    "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160882, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160882 },
    "harness|truthfulqa:mc|0": { "mc1": 0.4320685434516524, "mc1_stderr": 0.017341202394988257, "mc2": 0.5949809262920956, "mc2_stderr": 0.015506243501904441 },
    "harness|winogrande|5": { "acc": 0.8468823993685872, "acc_stderr": 0.010120623252272956 },
    "harness|gsm8k|5": { "acc": 0.6937073540561031, "acc_stderr": 0.0126969301065629 }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
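Beyond the per-task configs, the aggregated numbers shown under "Latest results" live in the "results" config declared in this card's metadata. A minimal sketch for pulling them, assuming that config loads like the per-task ones and using the split names listed below:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics for the run; "latest" mirrors the
# newest timestamped split (here 2024_01_11T06_51_24.778397).
results = load_dataset(
    "open-llm-leaderboard/details_senseable__garten2-7b",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / stderr fields for the whole run
```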
open-llm-leaderboard/details_senseable__garten2-7b
[ "region:us" ]
2024-01-11T06:53:42+00:00
{"pretty_name": "Evaluation run of senseable/garten2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [senseable/garten2-7b](https://huggingface.co/senseable/garten2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_senseable__garten2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T06:51:24.778397](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__garten2-7b/blob/main/results_2024-01-11T06-51-24.778397.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6592274978159288,\n \"acc_stderr\": 0.03186938315275516,\n \"acc_norm\": 0.6587692648732468,\n \"acc_norm_stderr\": 0.03253666311812878,\n \"mc1\": 0.4320685434516524,\n \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.5949809262920956,\n \"mc2_stderr\": 0.015506243501904441\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6663822525597269,\n \"acc_stderr\": 0.013778687054176536,\n \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276513\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7076279625572595,\n \"acc_stderr\": 0.004539227260397024,\n \"acc_norm\": 0.8754232224656443,\n \"acc_norm_stderr\": 0.003295634907666466\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 
0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n 
\"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8568807339449541,\n \"acc_stderr\": 0.015014462497168589,\n \"acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.015014462497168589\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098822,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098822\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 
0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46368715083798884,\n \"acc_stderr\": 0.016678341894533166,\n \"acc_norm\": 0.46368715083798884,\n \"acc_norm_stderr\": 0.016678341894533166\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6879084967320261,\n \"acc_stderr\": 0.018745011201277657,\n \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.018745011201277657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.5949809262920956,\n \"mc2_stderr\": 0.015506243501904441\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272956\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \"acc_stderr\": 0.0126969301065629\n }\n}\n```", "repo_url": "https://huggingface.co/senseable/garten2-7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|arc:challenge|25_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|gsm8k|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hellaswag|10_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-51-24.778397.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-51-24.778397.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-51-24.778397.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T06-51-24.778397.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-51-24.778397.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T06-51-24.778397.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["**/details_harness|winogrande|5_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T06-51-24.778397.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T06_51_24.778397", "path": ["results_2024-01-11T06-51-24.778397.parquet"]}, {"split": "latest", "path": 
["results_2024-01-11T06-51-24.778397.parquet"]}]}]}
2024-01-11T06:54:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of senseable/garten2-7b Dataset automatically created during the evaluation run of model senseable/garten2-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the snippet after this card): ## Latest results These are the latest results from run 2024-01-11T06:51:24.778397 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
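The loading snippet referenced above was stripped when this card was flattened; a minimal reconstruction following the standard leaderboard template is sketched below. The repository id is an assumption inferred from the `details_<org>__<model>` naming convention visible in sibling evaluation cards, not confirmed by this record:

```python
from datasets import load_dataset

# Assumed repository id, derived from the model name senseable/garten2-7b
# following the details_<org>__<model> convention of other evaluation runs.
data = load_dataset("open-llm-leaderboard/details_senseable__garten2-7b",
                    "harness_winogrande_5",
                    split="train")
```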
[ "# Dataset Card for Evaluation run of senseable/garten2-7b\n\n\n\nDataset automatically created during the evaluation run of model senseable/garten2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T06:51:24.778397(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of senseable/garten2-7b\n\n\n\nDataset automatically created during the evaluation run of model senseable/garten2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T06:51:24.778397(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7f902382264a05620cc6c45da752a1ee0ac8361d
# Stable Diffusion web UI A browser interface based on the Gradio library for Stable Diffusion. ![](screenshot.png) ## Features [Detailed feature showcase with images](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Features): - Original txt2img and img2img modes - One-click install and run script (but you still must install Python and git) - Outpainting - Inpainting - Color Sketch - Prompt Matrix - Stable Diffusion Upscale - Attention, specify parts of text that the model should pay more attention to - a man in a `((tuxedo))` - will pay more attention to tuxedo - a man in a `(tuxedo:1.21)` - alternative syntax - select text and press `Ctrl+Up` or `Ctrl+Down` (or `Command+Up` or `Command+Down` if you're on macOS) to automatically adjust attention to selected text (code contributed by anonymous user) - Loopback, run img2img processing multiple times - X/Y/Z plot, a way to draw a 3-dimensional plot of images with different parameters - Textual Inversion - have as many embeddings as you want and use any names you like for them - use multiple embeddings with different numbers of vectors per token - works with half precision floating point numbers - train embeddings on 8GB (also reports of 6GB working) - Extras tab with: - GFPGAN, neural network that fixes faces - CodeFormer, face restoration tool as an alternative to GFPGAN - RealESRGAN, neural network upscaler - ESRGAN, neural network upscaler with a lot of third party models - SwinIR and Swin2SR ([see here](https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/2092)), neural network upscalers - LDSR, Latent diffusion super resolution upscaling - Resizing aspect ratio options - Sampling method selection - Adjust sampler eta values (noise multiplier) - More advanced noise setting options - Interrupt processing at any time - 4GB video card support (also reports of 2GB working) - Correct seeds for batches - Live prompt token length validation - Generation parameters - parameters you used to generate images are saved with that image - in PNG chunks for PNG, in EXIF for JPEG - can drag the image to PNG info tab to restore generation parameters and automatically copy them into UI - can be disabled in settings - drag and drop an image/text-parameters to promptbox - Read Generation Parameters Button, loads parameters in promptbox to UI - Settings page - Running arbitrary Python code from UI (must run with `--allow-code` to enable) - Mouseover hints for most UI elements - Possible to change defaults/min/max/step values for UI elements via text config - Tiling support, a checkbox to create images that can be tiled like textures - Progress bar and live image generation preview - Can use a separate neural network to produce previews with almost no VRAM or compute requirement - Negative prompt, an extra text field that allows you to list what you don't want to see in the generated image - Styles, a way to save part of a prompt and easily apply it via dropdown later - Variations, a way to generate the same image but with tiny differences - Seed resizing, a way to generate the same image but at slightly different resolution - CLIP interrogator, a button that tries to guess the prompt from an image - Prompt Editing, a way to change the prompt mid-generation, say to start making a watermelon and switch to anime girl midway - Batch Processing, process a group of files using img2img - Img2img Alternative, reverse Euler method of cross attention control - Highres Fix, a convenience option to produce high resolution pictures in one click without the usual distortions
- Reloading checkpoints on the fly - Checkpoint Merger, a tab that allows you to merge up to 3 checkpoints into one - [Custom scripts](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Custom-Scripts) with many extensions from the community - [Composable-Diffusion](https://energy-based-model.github.io/Compositional-Visual-Generation-with-Composable-Diffusion-Models/), a way to use multiple prompts at once - separate prompts using uppercase `AND` - also supports weights for prompts: `a cat :1.2 AND a dog AND a penguin :2.2` - No token limit for prompts (original stable diffusion lets you use up to 75 tokens) - DeepDanbooru integration, creates danbooru-style tags for anime prompts - [xformers](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Xformers), major speed increase for select cards (add `--xformers` to commandline args) - via extension: [History tab](https://github.com/yfszzx/stable-diffusion-webui-images-browser): view, direct and delete images conveniently within the UI - Generate forever option - Training tab - hypernetworks and embeddings options - Preprocessing images: cropping, mirroring, autotagging using BLIP or deepdanbooru (for anime) - Clip skip - Hypernetworks - Loras (same as Hypernetworks but prettier) - A separate UI where you can choose, with preview, which embeddings, hypernetworks or Loras to add to your prompt - Can select to load a different VAE from settings screen - Estimated completion time in progress bar - API - Support for dedicated [inpainting model](https://github.com/runwayml/stable-diffusion#inpainting-with-stable-diffusion) by RunwayML - via extension: [Aesthetic Gradients](https://github.com/AUTOMATIC1111/stable-diffusion-webui-aesthetic-gradients), a way to generate images with a specific aesthetic by using CLIP image embeds (implementation of [https://github.com/vicgalle/stable-diffusion-aesthetic-gradients](https://github.com/vicgalle/stable-diffusion-aesthetic-gradients)) - [Stable Diffusion 2.0](https://github.com/Stability-AI/stablediffusion) support - see [wiki](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Features#stable-diffusion-20) for instructions - [Alt-Diffusion](https://arxiv.org/abs/2211.06679) support - see [wiki](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Features#alt-diffusion) for instructions - Now without any bad letters! - Load checkpoints in safetensors format - Eased resolution restriction: generated image's dimensions must be a multiple of 8 rather than 64 - Now with a license! - Reorder elements in the UI from settings screen - [Segmind Stable Diffusion](https://huggingface.co/segmind/SSD-1B) support ## Installation and Running Make sure the required [dependencies](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Dependencies) are met and follow the instructions available for: - [NVidia](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Install-and-Run-on-NVidia-GPUs) (recommended) - [AMD](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Install-and-Run-on-AMD-GPUs) GPUs. - [Intel CPUs, Intel GPUs (both integrated and discrete)](https://github.com/openvinotoolkit/stable-diffusion-webui/wiki/Installation-on-Intel-Silicon) (external wiki page) Alternatively, use online services (like Google Colab): - [List of Online Services](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Online-Services) ### Installation on Windows 10/11 with NVidia-GPUs using release package 1.
Download `sd.webui.zip` from [v1.0.0-pre](https://github.com/AUTOMATIC1111/stable-diffusion-webui/releases/tag/v1.0.0-pre) and extract its contents. 2. Run `update.bat`. 3. Run `run.bat`. > For more details see [Install-and-Run-on-NVidia-GPUs](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Install-and-Run-on-NVidia-GPUs) ### Automatic Installation on Windows 1. Install [Python 3.10.6](https://www.python.org/downloads/release/python-3106/) (newer versions of Python do not support torch), checking "Add Python to PATH". 2. Install [git](https://git-scm.com/download/win). 3. Download the stable-diffusion-webui repository, for example by running `git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git`. 4. Run `webui-user.bat` from Windows Explorer as a normal, non-administrator user. ### Automatic Installation on Linux 1. Install the dependencies: ```bash # Debian-based: sudo apt install wget git python3 python3-venv libgl1 libglib2.0-0 # Red Hat-based: sudo dnf install wget git python3 gperftools-libs libglvnd-glx # openSUSE-based: sudo zypper install wget git python3 libtcmalloc4 libglvnd # Arch-based: sudo pacman -S wget git python3 ``` 2. Navigate to the directory you would like the webui to be installed in and execute the following command: ```bash wget -q https://raw.githubusercontent.com/AUTOMATIC1111/stable-diffusion-webui/master/webui.sh ``` 3. Run `webui.sh`. 4. Check `webui-user.sh` for options (a minimal sketch of this file is shown after the Credits list below). ### Installation on Apple Silicon Find the instructions [here](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Installation-on-Apple-Silicon). ## Contributing Here's how to add code to this repo: [Contributing](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Contributing) ## Documentation The documentation was moved from this README over to the project's [wiki](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki). For the purposes of getting Google and other search engines to crawl the wiki, here's a link to the (not for humans) [crawlable wiki](https://github-wiki-see.page/m/AUTOMATIC1111/stable-diffusion-webui/wiki). ## Credits Licenses for borrowed code can be found in the `Settings -> Licenses` screen, and also in the `html/licenses.html` file. - Stable Diffusion - https://github.com/Stability-AI/stablediffusion, https://github.com/CompVis/taming-transformers - k-diffusion - https://github.com/crowsonkb/k-diffusion.git - GFPGAN - https://github.com/TencentARC/GFPGAN.git - CodeFormer - https://github.com/sczhou/CodeFormer - ESRGAN - https://github.com/xinntao/ESRGAN - SwinIR - https://github.com/JingyunLiang/SwinIR - Swin2SR - https://github.com/mv-lab/swin2sr - LDSR - https://github.com/Hafiidz/latent-diffusion - MiDaS - https://github.com/isl-org/MiDaS - Ideas for optimizations - https://github.com/basujindal/stable-diffusion - Cross Attention layer optimization - Doggettx - https://github.com/Doggettx/stable-diffusion, original idea for prompt editing. - Cross Attention layer optimization - InvokeAI, lstein - https://github.com/invoke-ai/InvokeAI (originally http://github.com/lstein/stable-diffusion) - Sub-quadratic Cross Attention layer optimization - Alex Birch (https://github.com/Birch-san/diffusers/pull/1), Amin Rezaei (https://github.com/AminRezaei0x443/memory-efficient-attention) - Textual Inversion - Rinon Gal - https://github.com/rinongal/textual_inversion (we're not using his code, but we are using his ideas).
- Idea for SD upscale - https://github.com/jquesnelle/txt2imghd - Noise generation for outpainting mk2 - https://github.com/parlance-zz/g-diffuser-bot - CLIP interrogator idea and borrowing some code - https://github.com/pharmapsychotic/clip-interrogator - Idea for Composable Diffusion - https://github.com/energy-based-model/Compositional-Visual-Generation-with-Composable-Diffusion-Models-PyTorch - xformers - https://github.com/facebookresearch/xformers - DeepDanbooru - interrogator for anime diffusers https://github.com/KichangKim/DeepDanbooru - Sampling in float32 precision from a float16 UNet - marunine for the idea, Birch-san for the example Diffusers implementation (https://github.com/Birch-san/diffusers-play/tree/92feee6) - Instruct pix2pix - Tim Brooks (star), Aleksander Holynski (star), Alexei A. Efros (no star) - https://github.com/timothybrooks/instruct-pix2pix - Security advice - RyotaK - UniPC sampler - Wenliang Zhao - https://github.com/wl-zhao/UniPC - TAESD - Ollin Boer Bohan - https://github.com/madebyollin/taesd - LyCORIS - KohakuBlueleaf - Restart sampling - lambertae - https://github.com/Newbeeer/diffusion_restart_sampling - Hypertile - tfernd - https://github.com/tfernd/HyperTile - Initial Gradio script - posted on 4chan by an Anonymous user. Thank you Anonymous user. - (You)
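As the minimal sketch referenced in the Linux installation steps above: the variable name `COMMANDLINE_ARGS` matches the stock `webui-user.sh` shipped with the repository, while the specific flags are only the examples mentioned in the Features list (`--xformers`, `--allow-code`), not a recommended default:

```bash
#!/bin/bash
# Minimal webui-user.sh sketch (assumes the stock variable name of the repo).
# COMMANDLINE_ARGS is read by webui.sh and passed to the launcher; the flags
# below are illustrative examples taken from the Features list above.
export COMMANDLINE_ARGS="--xformers --allow-code"
```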
PennyJX/stable-diffusion-webui
[ "arxiv:2211.06679", "region:us" ]
2024-01-11T06:56:24+00:00
{}
2024-01-12T08:01:28+00:00
[ "2211.06679" ]
[]
TAGS #arxiv-2211.06679 #region-us
# Stable Diffusion web UI A browser interface based on the Gradio library for Stable Diffusion. ![](URL) ## Features Detailed feature showcase with images: - Original txt2img and img2img modes - One-click install and run script (but you still must install Python and git) - Outpainting - Inpainting - Color Sketch - Prompt Matrix - Stable Diffusion Upscale - Attention, specify parts of text that the model should pay more attention to - a man in a '((tuxedo))' - will pay more attention to tuxedo - a man in a '(tuxedo:1.21)' - alternative syntax - select text and press 'Ctrl+Up' or 'Ctrl+Down' (or 'Command+Up' or 'Command+Down' if you're on macOS) to automatically adjust attention to selected text (code contributed by anonymous user) - Loopback, run img2img processing multiple times - X/Y/Z plot, a way to draw a 3-dimensional plot of images with different parameters - Textual Inversion - have as many embeddings as you want and use any names you like for them - use multiple embeddings with different numbers of vectors per token - works with half precision floating point numbers - train embeddings on 8GB (also reports of 6GB working) - Extras tab with: - GFPGAN, neural network that fixes faces - CodeFormer, face restoration tool as an alternative to GFPGAN - RealESRGAN, neural network upscaler - ESRGAN, neural network upscaler with a lot of third party models - SwinIR and Swin2SR (see here), neural network upscalers - LDSR, Latent diffusion super resolution upscaling - Resizing aspect ratio options - Sampling method selection - Adjust sampler eta values (noise multiplier) - More advanced noise setting options - Interrupt processing at any time - 4GB video card support (also reports of 2GB working) - Correct seeds for batches - Live prompt token length validation - Generation parameters - parameters you used to generate images are saved with that image - in PNG chunks for PNG, in EXIF for JPEG - can drag the image to PNG info tab to restore generation parameters and automatically copy them into UI - can be disabled in settings - drag and drop an image/text-parameters to promptbox - Read Generation Parameters Button, loads parameters in promptbox to UI - Settings page - Running arbitrary Python code from UI (must run with '--allow-code' to enable) - Mouseover hints for most UI elements - Possible to change defaults/min/max/step values for UI elements via text config - Tiling support, a checkbox to create images that can be tiled like textures - Progress bar and live image generation preview - Can use a separate neural network to produce previews with almost no VRAM or compute requirement - Negative prompt, an extra text field that allows you to list what you don't want to see in the generated image - Styles, a way to save part of a prompt and easily apply it via dropdown later - Variations, a way to generate the same image but with tiny differences - Seed resizing, a way to generate the same image but at slightly different resolution - CLIP interrogator, a button that tries to guess the prompt from an image - Prompt Editing, a way to change the prompt mid-generation, say to start making a watermelon and switch to anime girl midway - Batch Processing, process a group of files using img2img - Img2img Alternative, reverse Euler method of cross attention control - Highres Fix, a convenience option to produce high resolution pictures in one click without the usual distortions - Reloading checkpoints on the fly - Checkpoint Merger, a tab that allows you to merge up to 3 checkpoints into one - Custom scripts with many
extensions from the community - Composable-Diffusion, a way to use multiple prompts at once - separate prompts using uppercase 'AND' - also supports weights for prompts: 'a cat :1.2 AND a dog AND a penguin :2.2' - No token limit for prompts (original stable diffusion lets you use up to 75 tokens) - DeepDanbooru integration, creates danbooru-style tags for anime prompts - xformers, major speed increase for select cards (add '--xformers' to commandline args) - via extension: History tab: view, direct and delete images conveniently within the UI - Generate forever option - Training tab - hypernetworks and embeddings options - Preprocessing images: cropping, mirroring, autotagging using BLIP or deepdanbooru (for anime) - Clip skip - Hypernetworks - Loras (same as Hypernetworks but prettier) - A separate UI where you can choose, with preview, which embeddings, hypernetworks or Loras to add to your prompt - Can select to load a different VAE from settings screen - Estimated completion time in progress bar - API - Support for dedicated inpainting model by RunwayML - via extension: Aesthetic Gradients, a way to generate images with a specific aesthetic by using CLIP image embeds (implementation of URL) - Stable Diffusion 2.0 support - see wiki for instructions - Alt-Diffusion support - see wiki for instructions - Now without any bad letters! - Load checkpoints in safetensors format - Eased resolution restriction: generated image's dimensions must be a multiple of 8 rather than 64 - Now with a license! - Reorder elements in the UI from settings screen - Segmind Stable Diffusion support ## Installation and Running Make sure the required dependencies are met and follow the instructions available for: - NVidia (recommended) - AMD GPUs. - Intel CPUs, Intel GPUs (both integrated and discrete) (external wiki page) Alternatively, use online services (like Google Colab): - List of Online Services ### Installation on Windows 10/11 with NVidia-GPUs using release package 1. Download 'URL' from v1.0.0-pre and extract its contents. 2. Run 'URL'. 3. Run 'URL'. > For more details see Install-and-Run-on-NVidia-GPUs ### Automatic Installation on Windows 1. Install Python 3.10.6 (newer versions of Python do not support torch), checking "Add Python to PATH". 2. Install git. 3. Download the stable-diffusion-webui repository, for example by running 'git clone URL'. 4. Run 'URL' from Windows Explorer as a normal, non-administrator user. ### Automatic Installation on Linux 1. Install the dependencies: 2. Navigate to the directory you would like the webui to be installed in and execute the following command: 3. Run 'URL'. 4. Check 'URL' for options. ### Installation on Apple Silicon Find the instructions here. ## Contributing Here's how to add code to this repo: Contributing ## Documentation The documentation was moved from this README over to the project's wiki. For the purposes of getting Google and other search engines to crawl the wiki, here's a link to the (not for humans) crawlable wiki. ## Credits Licenses for borrowed code can be found in the 'Settings -> Licenses' screen, and also in the 'html/URL' file. - Stable Diffusion - URL URL - k-diffusion - URL - GFPGAN - URL - CodeFormer - URL - ESRGAN - URL - SwinIR - URL - Swin2SR - URL - LDSR - URL - MiDaS - URL - Ideas for optimizations - URL - Cross Attention layer optimization - Doggettx - URL, original idea for prompt editing.
- Cross Attention layer optimization - InvokeAI, lstein - URL (originally URL) - Sub-quadratic Cross Attention layer optimization - Alex Birch (URL), Amin Rezaei (URL) - Textual Inversion - Rinon Gal - URL (we're not using his code, but we are using his ideas). - Idea for SD upscale - URL - Noise generation for outpainting mk2 - URL - CLIP interrogator idea and borrowing some code - URL - Idea for Composable Diffusion - URL - xformers - URL - DeepDanbooru - interrogator for anime diffusers URL - Sampling in float32 precision from a float16 UNet - marunine for the idea, Birch-san for the example Diffusers implementation (URL) - Instruct pix2pix - Tim Brooks (star), Aleksander Holynski (star), Alexei A. Efros (no star) - URL - Security advice - RyotaK - UniPC sampler - Wenliang Zhao - URL - TAESD - Ollin Boer Bohan - URL - LyCORIS - KohakuBlueleaf - Restart sampling - lambertae - URL - Hypertile - tfernd - URL - Initial Gradio script - posted on 4chan by an Anonymous user. Thank you Anonymous user. - (You)
[ "# Stable Diffusion web UI\nA browser interface based on Gradio library for Stable Diffusion.\n\n![](URL)", "## Features\nDetailed feature showcase with images:\n- Original txt2img and img2img modes\n- One click install and run script (but you still must install python and git)\n- Outpainting\n- Inpainting\n- Color Sketch\n- Prompt Matrix\n- Stable Diffusion Upscale\n- Attention, specify parts of text that the model should pay more attention to\n - a man in a '((tuxedo))' - will pay more attention to tuxedo\n - a man in a '(tuxedo:1.21)' - alternative syntax\n - select text and press 'Ctrl+Up' or 'Ctrl+Down' (or 'Command+Up' or 'Command+Down' if you're on a MacOS) to automatically adjust attention to selected text (code contributed by anonymous user)\n- Loopback, run img2img processing multiple times\n- X/Y/Z plot, a way to draw a 3 dimensional plot of images with different parameters\n- Textual Inversion\n - have as many embeddings as you want and use any names you like for them\n - use multiple embeddings with different numbers of vectors per token\n - works with half precision floating point numbers\n - train embeddings on 8GB (also reports of 6GB working)\n- Extras tab with:\n - GFPGAN, neural network that fixes faces\n - CodeFormer, face restoration tool as an alternative to GFPGAN\n - RealESRGAN, neural network upscaler\n - ESRGAN, neural network upscaler with a lot of third party models\n - SwinIR and Swin2SR (see here), neural network upscalers\n - LDSR, Latent diffusion super resolution upscaling\n- Resizing aspect ratio options\n- Sampling method selection\n - Adjust sampler eta values (noise multiplier)\n - More advanced noise setting options\n- Interrupt processing at any time\n- 4GB video card support (also reports of 2GB working)\n- Correct seeds for batches\n- Live prompt token length validation\n- Generation parameters\n - parameters you used to generate images are saved with that image\n - in PNG chunks for PNG, in EXIF for JPEG\n - can drag the image to PNG info tab to restore generation parameters and automatically copy them into UI\n - can be disabled in settings\n - drag and drop an image/text-parameters to promptbox\n- Read Generation Parameters Button, loads parameters in promptbox to UI\n- Settings page\n- Running arbitrary python code from UI (must run with '--allow-code' to enable)\n- Mouseover hints for most UI elements\n- Possible to change defaults/mix/max/step values for UI elements via text config\n- Tiling support, a checkbox to create images that can be tiled like textures\n- Progress bar and live image generation preview\n - Can use a separate neural network to produce previews with almost none VRAM or compute requirement\n- Negative prompt, an extra text field that allows you to list what you don't want to see in generated image\n- Styles, a way to save part of prompt and easily apply them via dropdown later\n- Variations, a way to generate same image but with tiny differences\n- Seed resizing, a way to generate same image but at slightly different resolution\n- CLIP interrogator, a button that tries to guess prompt from an image\n- Prompt Editing, a way to change prompt mid-generation, say to start making a watermelon and switch to anime girl midway\n- Batch Processing, process a group of files using img2img\n- Img2img Alternative, reverse Euler method of cross attention control\n- Highres Fix, a convenience option to produce high resolution pictures in one click without usual distortions\n- Reloading checkpoints on the fly\n- Checkpoint Merger, a tab 
that allows you to merge up to 3 checkpoints into one\n- Custom scripts with many extensions from community\n- Composable-Diffusion, a way to use multiple prompts at once\n - separate prompts using uppercase 'AND'\n - also supports weights for prompts: 'a cat :1.2 AND a dog AND a penguin :2.2'\n- No token limit for prompts (original stable diffusion lets you use up to 75 tokens)\n- DeepDanbooru integration, creates danbooru style tags for anime prompts\n- xformers, major speed increase for select cards: (add '--xformers' to commandline args)\n- via extension: History tab: view, direct and delete images conveniently within the UI\n- Generate forever option\n- Training tab\n - hypernetworks and embeddings options\n - Preprocessing images: cropping, mirroring, autotagging using BLIP or deepdanbooru (for anime)\n- Clip skip\n- Hypernetworks\n- Loras (same as Hypernetworks but more pretty)\n- A separate UI where you can choose, with preview, which embeddings, hypernetworks or Loras to add to your prompt \n- Can select to load a different VAE from settings screen\n- Estimated completion time in progress bar\n- API\n- Support for dedicated inpainting model by RunwayML\n- via extension: Aesthetic Gradients, a way to generate images with a specific aesthetic by using clip images embeds (implementation of URL\n- Stable Diffusion 2.0 support - see wiki for instructions\n- Alt-Diffusion support - see wiki for instructions\n- Now without any bad letters!\n- Load checkpoints in safetensors format\n- Eased resolution restriction: generated image's dimensions must be a multiple of 8 rather than 64\n- Now with a license!\n- Reorder elements in the UI from settings screen\n- Segmind Stable Diffusion support", "## Installation and Running\nMake sure the required dependencies are met and follow the instructions available for:\n- NVidia (recommended)\n- AMD GPUs.\n- Intel CPUs, Intel GPUs (both integrated and discrete) (external wiki page)\n\nAlternatively, use online services (like Google Colab):\n\n- List of Online Services", "### Installation on Windows 10/11 with NVidia-GPUs using release package\n1. Download 'URL' from v1.0.0-pre and extract its contents.\n2. Run 'URL'.\n3. Run 'URL'.\n> For more details see Install-and-Run-on-NVidia-GPUs", "### Automatic Installation on Windows\n1. Install Python 3.10.6 (Newer version of Python does not support torch), checking \"Add Python to PATH\".\n2. Install git.\n3. Download the stable-diffusion-webui repository, for example by running 'git clone URL\n4. Run 'URL' from Windows Explorer as normal, non-administrator, user.", "### Automatic Installation on Linux\n1. Install the dependencies:\n\n2. Navigate to the directory you would like the webui to be installed and execute the following command:\n\n3. Run 'URL'.\n4. 
Check 'URL' for options.", "### Installation on Apple Silicon\n\nFind the instructions here.", "## Contributing\nHere's how to add code to this repo: Contributing", "## Documentation\n\nThe documentation was moved from this README over to the project's wiki.\n\nFor the purposes of getting Google and other search engines to crawl the wiki, here's a link to the (not for humans) crawlable wiki.", "## Credits\nLicenses for borrowed code can be found in 'Settings -> Licenses' screen, and also in 'html/URL' file.\n\n- Stable Diffusion - URL URL\n- k-diffusion - URL\n- GFPGAN - URL\n- CodeFormer - URL\n- ESRGAN - URL\n- SwinIR - URL\n- Swin2SR - URL\n- LDSR - URL\n- MiDaS - URL\n- Ideas for optimizations - URL\n- Cross Attention layer optimization - Doggettx - URL original idea for prompt editing.\n- Cross Attention layer optimization - InvokeAI, lstein - URL (originally URL\n- Sub-quadratic Cross Attention layer optimization - Alex Birch (URL Amin Rezaei (URL\n- Textual Inversion - Rinon Gal - URL (we're not using his code, but we are using his ideas).\n- Idea for SD upscale - URL\n- Noise generation for outpainting mk2 - URL\n- CLIP interrogator idea and borrowing some code - URL\n- Idea for Composable Diffusion - URL\n- xformers - URL\n- DeepDanbooru - interrogator for anime diffusers URL\n- Sampling in float32 precision from a float16 UNet - marunine for the idea, Birch-san for the example Diffusers implementation (URL\n- Instruct pix2pix - Tim Brooks (star), Aleksander Holynski (star), Alexei A. Efros (no star) - URL\n- Security advice - RyotaK\n- UniPC sampler - Wenliang Zhao - URL\n- TAESD - Ollin Boer Bohan - URL\n- LyCORIS - KohakuBlueleaf\n- Restart sampling - lambertae - URL\n- Hypertile - tfernd - URL\n- Initial Gradio script - posted on 4chan by an Anonymous user. Thank you Anonymous user.\n- (You)" ]
[ "TAGS\n#arxiv-2211.06679 #region-us \n", "# Stable Diffusion web UI\nA browser interface based on Gradio library for Stable Diffusion.\n\n![](URL)", "## Features\nDetailed feature showcase with images:\n- Original txt2img and img2img modes\n- One click install and run script (but you still must install python and git)\n- Outpainting\n- Inpainting\n- Color Sketch\n- Prompt Matrix\n- Stable Diffusion Upscale\n- Attention, specify parts of text that the model should pay more attention to\n - a man in a '((tuxedo))' - will pay more attention to tuxedo\n - a man in a '(tuxedo:1.21)' - alternative syntax\n - select text and press 'Ctrl+Up' or 'Ctrl+Down' (or 'Command+Up' or 'Command+Down' if you're on a MacOS) to automatically adjust attention to selected text (code contributed by anonymous user)\n- Loopback, run img2img processing multiple times\n- X/Y/Z plot, a way to draw a 3 dimensional plot of images with different parameters\n- Textual Inversion\n - have as many embeddings as you want and use any names you like for them\n - use multiple embeddings with different numbers of vectors per token\n - works with half precision floating point numbers\n - train embeddings on 8GB (also reports of 6GB working)\n- Extras tab with:\n - GFPGAN, neural network that fixes faces\n - CodeFormer, face restoration tool as an alternative to GFPGAN\n - RealESRGAN, neural network upscaler\n - ESRGAN, neural network upscaler with a lot of third party models\n - SwinIR and Swin2SR (see here), neural network upscalers\n - LDSR, Latent diffusion super resolution upscaling\n- Resizing aspect ratio options\n- Sampling method selection\n - Adjust sampler eta values (noise multiplier)\n - More advanced noise setting options\n- Interrupt processing at any time\n- 4GB video card support (also reports of 2GB working)\n- Correct seeds for batches\n- Live prompt token length validation\n- Generation parameters\n - parameters you used to generate images are saved with that image\n - in PNG chunks for PNG, in EXIF for JPEG\n - can drag the image to PNG info tab to restore generation parameters and automatically copy them into UI\n - can be disabled in settings\n - drag and drop an image/text-parameters to promptbox\n- Read Generation Parameters Button, loads parameters in promptbox to UI\n- Settings page\n- Running arbitrary python code from UI (must run with '--allow-code' to enable)\n- Mouseover hints for most UI elements\n- Possible to change defaults/mix/max/step values for UI elements via text config\n- Tiling support, a checkbox to create images that can be tiled like textures\n- Progress bar and live image generation preview\n - Can use a separate neural network to produce previews with almost none VRAM or compute requirement\n- Negative prompt, an extra text field that allows you to list what you don't want to see in generated image\n- Styles, a way to save part of prompt and easily apply them via dropdown later\n- Variations, a way to generate same image but with tiny differences\n- Seed resizing, a way to generate same image but at slightly different resolution\n- CLIP interrogator, a button that tries to guess prompt from an image\n- Prompt Editing, a way to change prompt mid-generation, say to start making a watermelon and switch to anime girl midway\n- Batch Processing, process a group of files using img2img\n- Img2img Alternative, reverse Euler method of cross attention control\n- Highres Fix, a convenience option to produce high resolution pictures in one click without usual distortions\n- Reloading 
checkpoints on the fly\n- Checkpoint Merger, a tab that allows you to merge up to 3 checkpoints into one\n- Custom scripts with many extensions from community\n- Composable-Diffusion, a way to use multiple prompts at once\n - separate prompts using uppercase 'AND'\n - also supports weights for prompts: 'a cat :1.2 AND a dog AND a penguin :2.2'\n- No token limit for prompts (original stable diffusion lets you use up to 75 tokens)\n- DeepDanbooru integration, creates danbooru style tags for anime prompts\n- xformers, major speed increase for select cards: (add '--xformers' to commandline args)\n- via extension: History tab: view, direct and delete images conveniently within the UI\n- Generate forever option\n- Training tab\n - hypernetworks and embeddings options\n - Preprocessing images: cropping, mirroring, autotagging using BLIP or deepdanbooru (for anime)\n- Clip skip\n- Hypernetworks\n- Loras (same as Hypernetworks but more pretty)\n- A separate UI where you can choose, with preview, which embeddings, hypernetworks or Loras to add to your prompt \n- Can select to load a different VAE from settings screen\n- Estimated completion time in progress bar\n- API\n- Support for dedicated inpainting model by RunwayML\n- via extension: Aesthetic Gradients, a way to generate images with a specific aesthetic by using clip images embeds (implementation of URL\n- Stable Diffusion 2.0 support - see wiki for instructions\n- Alt-Diffusion support - see wiki for instructions\n- Now without any bad letters!\n- Load checkpoints in safetensors format\n- Eased resolution restriction: generated image's dimensions must be a multiple of 8 rather than 64\n- Now with a license!\n- Reorder elements in the UI from settings screen\n- Segmind Stable Diffusion support", "## Installation and Running\nMake sure the required dependencies are met and follow the instructions available for:\n- NVidia (recommended)\n- AMD GPUs.\n- Intel CPUs, Intel GPUs (both integrated and discrete) (external wiki page)\n\nAlternatively, use online services (like Google Colab):\n\n- List of Online Services", "### Installation on Windows 10/11 with NVidia-GPUs using release package\n1. Download 'URL' from v1.0.0-pre and extract its contents.\n2. Run 'URL'.\n3. Run 'URL'.\n> For more details see Install-and-Run-on-NVidia-GPUs", "### Automatic Installation on Windows\n1. Install Python 3.10.6 (Newer version of Python does not support torch), checking \"Add Python to PATH\".\n2. Install git.\n3. Download the stable-diffusion-webui repository, for example by running 'git clone URL\n4. Run 'URL' from Windows Explorer as normal, non-administrator, user.", "### Automatic Installation on Linux\n1. Install the dependencies:\n\n2. Navigate to the directory you would like the webui to be installed and execute the following command:\n\n3. Run 'URL'.\n4. 
Check 'URL' for options.", "### Installation on Apple Silicon\n\nFind the instructions here.", "## Contributing\nHere's how to add code to this repo: Contributing", "## Documentation\n\nThe documentation was moved from this README over to the project's wiki.\n\nFor the purposes of getting Google and other search engines to crawl the wiki, here's a link to the (not for humans) crawlable wiki.", "## Credits\nLicenses for borrowed code can be found in 'Settings -> Licenses' screen, and also in 'html/URL' file.\n\n- Stable Diffusion - URL URL\n- k-diffusion - URL\n- GFPGAN - URL\n- CodeFormer - URL\n- ESRGAN - URL\n- SwinIR - URL\n- Swin2SR - URL\n- LDSR - URL\n- MiDaS - URL\n- Ideas for optimizations - URL\n- Cross Attention layer optimization - Doggettx - URL original idea for prompt editing.\n- Cross Attention layer optimization - InvokeAI, lstein - URL (originally URL\n- Sub-quadratic Cross Attention layer optimization - Alex Birch (URL Amin Rezaei (URL\n- Textual Inversion - Rinon Gal - URL (we're not using his code, but we are using his ideas).\n- Idea for SD upscale - URL\n- Noise generation for outpainting mk2 - URL\n- CLIP interrogator idea and borrowing some code - URL\n- Idea for Composable Diffusion - URL\n- xformers - URL\n- DeepDanbooru - interrogator for anime diffusers URL\n- Sampling in float32 precision from a float16 UNet - marunine for the idea, Birch-san for the example Diffusers implementation (URL\n- Instruct pix2pix - Tim Brooks (star), Aleksander Holynski (star), Alexei A. Efros (no star) - URL\n- Security advice - RyotaK\n- UniPC sampler - Wenliang Zhao - URL\n- TAESD - Ollin Boer Bohan - URL\n- LyCORIS - KohakuBlueleaf\n- Restart sampling - lambertae - URL\n- Hypertile - tfernd - URL\n- Initial Gradio script - posted on 4chan by an Anonymous user. Thank you Anonymous user.\n- (You)" ]
a83ab6c30c61dbbfb52d758c8e073d68d72c367b
# Dataset Card for Evaluation run of AA051611/A0110 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [AA051611/A0110](https://huggingface.co/AA051611/A0110) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_AA051611__A0110", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T07:22:47.294306](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0110/blob/main/results_2024-01-11T07-22-47.294306.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7412573048084544, "acc_stderr": 0.028863410665461813, "acc_norm": 0.745188483421092, "acc_norm_stderr": 0.029413193311414305, "mc1": 0.397796817625459, "mc1_stderr": 0.017133934248559635, "mc2": 0.5860453215820967, "mc2_stderr": 0.015209324923113767 }, "harness|arc:challenge|25": { "acc": 0.6356655290102389, "acc_stderr": 0.014063260279882417, "acc_norm": 0.6638225255972696, "acc_norm_stderr": 0.013804855026205761 }, "harness|hellaswag|10": { "acc": 0.6546504680342561, "acc_stderr": 0.0047451035439012934, "acc_norm": 0.8473411670981876, "acc_norm_stderr": 0.003589232889306521 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7185185185185186, "acc_stderr": 0.038850042458002526, "acc_norm": 0.7185185185185186, "acc_norm_stderr": 0.038850042458002526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.868421052631579, "acc_stderr": 0.027508689533549905, "acc_norm": 0.868421052631579, "acc_norm_stderr": 0.027508689533549905 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7924528301886793, "acc_stderr": 0.024959918028911274, "acc_norm": 0.7924528301886793, "acc_norm_stderr": 0.024959918028911274 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8194444444444444, "acc_stderr": 0.032166008088022675, "acc_norm": 0.8194444444444444, "acc_norm_stderr": 0.032166008088022675 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-college_mathematics|5": { "acc":
0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7109826589595376, "acc_stderr": 0.03456425745086999, "acc_norm": 0.7109826589595376, "acc_norm_stderr": 0.03456425745086999 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.47058823529411764, "acc_stderr": 0.04966570903978529, "acc_norm": 0.47058823529411764, "acc_norm_stderr": 0.04966570903978529 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7617021276595745, "acc_stderr": 0.027851252973889788, "acc_norm": 0.7617021276595745, "acc_norm_stderr": 0.027851252973889788 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5789473684210527, "acc_stderr": 0.046446020912223177, "acc_norm": 0.5789473684210527, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7310344827586207, "acc_stderr": 0.036951833116502325, "acc_norm": 0.7310344827586207, "acc_norm_stderr": 0.036951833116502325 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.626984126984127, "acc_stderr": 0.02490699045899257, "acc_norm": 0.626984126984127, "acc_norm_stderr": 0.02490699045899257 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5396825396825397, "acc_stderr": 0.04458029125470973, "acc_norm": 0.5396825396825397, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8774193548387097, "acc_stderr": 0.018656720991789406, "acc_norm": 0.8774193548387097, "acc_norm_stderr": 0.018656720991789406 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5862068965517241, "acc_stderr": 0.03465304488406795, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.03465304488406795 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8424242424242424, "acc_stderr": 0.028450388805284357, "acc_norm": 0.8424242424242424, "acc_norm_stderr": 0.028450388805284357 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8888888888888888, "acc_stderr": 0.02239078763821677, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.02239078763821677 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9637305699481865, "acc_stderr": 0.013492659751295145, "acc_norm": 0.9637305699481865, "acc_norm_stderr": 0.013492659751295145 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8, "acc_stderr": 0.020280805062535722, "acc_norm": 0.8, "acc_norm_stderr": 0.020280805062535722 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.02995824925008211, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.02995824925008211 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8403361344537815, "acc_stderr": 0.0237933539975288, "acc_norm": 0.8403361344537815, "acc_norm_stderr": 0.0237933539975288 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.45695364238410596, "acc_stderr": 0.04067325174247443, "acc_norm": 0.45695364238410596, "acc_norm_stderr": 0.04067325174247443 }, 
"harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9064220183486239, "acc_stderr": 0.012486841824601963, "acc_norm": 0.9064220183486239, "acc_norm_stderr": 0.012486841824601963 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0321495214780275, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0321495214780275 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9117647058823529, "acc_stderr": 0.019907399791316942, "acc_norm": 0.9117647058823529, "acc_norm_stderr": 0.019907399791316942 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8987341772151899, "acc_stderr": 0.019637720526065515, "acc_norm": 0.8987341772151899, "acc_norm_stderr": 0.019637720526065515 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7713004484304933, "acc_stderr": 0.028188240046929203, "acc_norm": 0.7713004484304933, "acc_norm_stderr": 0.028188240046929203 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8854961832061069, "acc_stderr": 0.027927473753597453, "acc_norm": 0.8854961832061069, "acc_norm_stderr": 0.027927473753597453 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9008264462809917, "acc_stderr": 0.02728524631275896, "acc_norm": 0.9008264462809917, "acc_norm_stderr": 0.02728524631275896 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8888888888888888, "acc_stderr": 0.030381596756651655, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.030381596756651655 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8404907975460123, "acc_stderr": 0.028767481725983878, "acc_norm": 0.8404907975460123, "acc_norm_stderr": 0.028767481725983878 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5892857142857143, "acc_stderr": 0.04669510663875191, "acc_norm": 0.5892857142857143, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.912621359223301, "acc_stderr": 0.027960689125970654, "acc_norm": 0.912621359223301, "acc_norm_stderr": 0.027960689125970654 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9358974358974359, "acc_stderr": 0.01604626163167314, "acc_norm": 0.9358974358974359, "acc_norm_stderr": 0.01604626163167314 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9042145593869731, "acc_stderr": 0.010524031079055826, "acc_norm": 0.9042145593869731, "acc_norm_stderr": 0.010524031079055826 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8005780346820809, "acc_stderr": 0.021511900654252555, "acc_norm": 0.8005780346820809, "acc_norm_stderr": 0.021511900654252555 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6189944134078212, "acc_stderr": 0.016242028834053613, "acc_norm": 0.6189944134078212, "acc_norm_stderr": 0.016242028834053613 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8071895424836601, "acc_stderr": 0.02258931888817669, "acc_norm": 0.8071895424836601, "acc_norm_stderr": 0.02258931888817669 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8295819935691319, "acc_stderr": 0.021355343028264043, "acc_norm": 0.8295819935691319, "acc_norm_stderr": 0.021355343028264043 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8302469135802469, "acc_stderr": 0.02088869041409387, "acc_norm": 0.8302469135802469, "acc_norm_stderr": 0.02088869041409387 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6063829787234043, "acc_stderr": 0.02914454478159616, 
"acc_norm": 0.6063829787234043, "acc_norm_stderr": 0.02914454478159616 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5573663624511083, "acc_stderr": 0.012685906538206237, "acc_norm": 0.5573663624511083, "acc_norm_stderr": 0.012685906538206237 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8235294117647058, "acc_stderr": 0.023157468308559345, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.023157468308559345 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7924836601307189, "acc_stderr": 0.016405924270103234, "acc_norm": 0.7924836601307189, "acc_norm_stderr": 0.016405924270103234 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.04309118709946458, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.04309118709946458 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8081632653061225, "acc_stderr": 0.0252069631542254, "acc_norm": 0.8081632653061225, "acc_norm_stderr": 0.0252069631542254 }, "harness|hendrycksTest-sociology|5": { "acc": 0.900497512437811, "acc_stderr": 0.021166216304659393, "acc_norm": 0.900497512437811, "acc_norm_stderr": 0.021166216304659393 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.94, "acc_stderr": 0.023868325657594173, "acc_norm": 0.94, "acc_norm_stderr": 0.023868325657594173 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8947368421052632, "acc_stderr": 0.02353755765789256, "acc_norm": 0.8947368421052632, "acc_norm_stderr": 0.02353755765789256 }, "harness|truthfulqa:mc|0": { "mc1": 0.397796817625459, "mc1_stderr": 0.017133934248559635, "mc2": 0.5860453215820967, "mc2_stderr": 0.015209324923113767 }, "harness|winogrande|5": { "acc": 0.8232044198895028, "acc_stderr": 0.010721923287918739 }, "harness|gsm8k|5": { "acc": 0.6482183472327521, "acc_stderr": 0.013153446023536039 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
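As a follow-up to the loading example above, the aggregated metrics shown under "Latest results" can also be pulled through the "results" configuration that this dataset's metadata declares. The sketch below is illustrative and not part of the generated card: it assumes only the `results` config and its `latest` split named in the metadata, and since the column layout of that split is not documented here, it simply inspects what comes back.

```python
from datasets import load_dataset

# Load the aggregated-results configuration; "results" and "latest" are the
# config and split names declared in this dataset's metadata.
agg = load_dataset(
    "open-llm-leaderboard/details_AA051611__A0110",
    "results",
    split="latest",
)

# The column layout of this split is not documented in the card, so inspect
# the returned Dataset rather than assuming field names.
print(agg)
print(agg.column_names)
```

To pin a specific run instead of the latest one, the timestamped split name from the metadata (`2024_01_11T07_22_47.294306`) can be passed as `split` in the same call.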
open-llm-leaderboard/details_AA051611__A0110
[ "region:us" ]
2024-01-11T07:25:34+00:00
{"pretty_name": "Evaluation run of AA051611/A0110", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051611/A0110](https://huggingface.co/AA051611/A0110) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__A0110\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T07:22:47.294306](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0110/blob/main/results_2024-01-11T07-22-47.294306.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7412573048084544,\n \"acc_stderr\": 0.028863410665461813,\n \"acc_norm\": 0.745188483421092,\n \"acc_norm_stderr\": 0.029413193311414305,\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5860453215820967,\n \"mc2_stderr\": 0.015209324923113767\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882417,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6546504680342561,\n \"acc_stderr\": 0.0047451035439012934,\n \"acc_norm\": 0.8473411670981876,\n \"acc_norm_stderr\": 0.003589232889306521\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549905,\n \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.024959918028911274,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.024959918028911274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n 
\"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.03456425745086999,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.03456425745086999\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889788,\n \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889788\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.626984126984127,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\": 0.626984126984127,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8774193548387097,\n \"acc_stderr\": 0.018656720991789406,\n \"acc_norm\": 0.8774193548387097,\n \"acc_norm_stderr\": 0.018656720991789406\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.03465304488406795,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.03465304488406795\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295145,\n \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295145\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 
0.020280805062535722,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02995824925008211,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02995824925008211\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8403361344537815,\n \"acc_stderr\": 0.0237933539975288,\n \"acc_norm\": 0.8403361344537815,\n \"acc_norm_stderr\": 0.0237933539975288\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9064220183486239,\n \"acc_stderr\": 0.012486841824601963,\n \"acc_norm\": 0.9064220183486239,\n \"acc_norm_stderr\": 0.012486841824601963\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316942,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316942\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065515,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065515\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597453\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8404907975460123,\n \"acc_stderr\": 0.028767481725983878,\n \"acc_norm\": 0.8404907975460123,\n \"acc_norm_stderr\": 0.028767481725983878\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.01604626163167314,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.01604626163167314\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n \"acc_stderr\": 0.010524031079055826,\n \"acc_norm\": 0.9042145593869731,\n \"acc_norm_stderr\": 
0.010524031079055826\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252555,\n \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252555\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6189944134078212,\n \"acc_stderr\": 0.016242028834053613,\n \"acc_norm\": 0.6189944134078212,\n \"acc_norm_stderr\": 0.016242028834053613\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.02258931888817669,\n \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.02258931888817669\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n \"acc_stderr\": 0.021355343028264043,\n \"acc_norm\": 0.8295819935691319,\n \"acc_norm_stderr\": 0.021355343028264043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.02088869041409387,\n \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.02088869041409387\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6063829787234043,\n \"acc_stderr\": 0.02914454478159616,\n \"acc_norm\": 0.6063829787234043,\n \"acc_norm_stderr\": 0.02914454478159616\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5573663624511083,\n \"acc_stderr\": 0.012685906538206237,\n \"acc_norm\": 0.5573663624511083,\n \"acc_norm_stderr\": 0.012685906538206237\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559345,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559345\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7924836601307189,\n \"acc_stderr\": 0.016405924270103234,\n \"acc_norm\": 0.7924836601307189,\n \"acc_norm_stderr\": 0.016405924270103234\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.0252069631542254,\n \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.0252069631542254\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594173,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594173\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789256,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789256\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5860453215820967,\n \"mc2_stderr\": 0.015209324923113767\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918739\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6482183472327521,\n \"acc_stderr\": 0.013153446023536039\n }\n}\n```", "repo_url": "https://huggingface.co/AA051611/A0110", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|arc:challenge|25_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|gsm8k|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hellaswag|10_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-22-47.294306.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-22-47.294306.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-22-47.294306.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T07-22-47.294306.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-22-47.294306.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-22-47.294306.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["**/details_harness|winogrande|5_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T07-22-47.294306.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T07_22_47.294306", "path": ["results_2024-01-11T07-22-47.294306.parquet"]}, {"split": "latest", "path": 
["results_2024-01-11T07-22-47.294306.parquet"]}]}]}
2024-01-11T07:25:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AA051611/A0110 Dataset automatically created during the evaluation run of model AA051611/A0110 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2024-01-11T07:22:47.294306 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
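The load snippet referenced above was stripped when this card was flattened; here is a minimal sketch of what it would look like, assuming the run's details live at the leaderboard's usual `details_<org>__<model>` location (the exact repo id `open-llm-leaderboard/details_AA051611__A0110` is inferred, not stated in this record):

```python
from datasets import load_dataset

# The repo id follows the leaderboard's "details_<org>__<model>" naming
# convention; it is an assumption here, not quoted from the record above.
data = load_dataset(
    "open-llm-leaderboard/details_AA051611__A0110",
    "harness_winogrande_5",  # any of the 63 task configurations listed in the metadata works
    split="latest",          # or the timestamped split "2024_01_11T07_22_47.294306"
)
```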
[ "# Dataset Card for Evaluation run of AA051611/A0110\n\n\n\nDataset automatically created during the evaluation run of model AA051611/A0110 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T07:22:47.294306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AA051611/A0110\n\n\n\nDataset automatically created during the evaluation run of model AA051611/A0110 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T07:22:47.294306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
941b2d4e7059a7858c0922056ab3c1ad63ce3f2e
# Dataset Card for Evaluation run of dfurman/GarrulusMarcoro-7B-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [dfurman/GarrulusMarcoro-7B-v0.1](https://huggingface.co/dfurman/GarrulusMarcoro-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T07:25:38.981116](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1/blob/main/results_2024-01-11T07-25-38.981116.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.65235582933386, "acc_stderr": 0.032031502829239805, "acc_norm": 0.6517234590534214, "acc_norm_stderr": 0.032709809296815245, "mc1": 0.5324357405140759, "mc1_stderr": 0.01746663214957761, "mc2": 0.6705208734039925, "mc2_stderr": 0.0153587467278896 }, "harness|arc:challenge|25": { "acc": 0.6979522184300341, "acc_stderr": 0.013417519144716417, "acc_norm": 0.7235494880546075, "acc_norm_stderr": 0.013069662474252423 }, "harness|hellaswag|10": { "acc": 0.715893248356901, "acc_stderr": 0.004500662294697923, "acc_norm": 0.8800039832702649, "acc_norm_stderr": 0.0032429275808698575 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.041539484047423976, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.041539484047423976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695238, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695238 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.02783491252754406, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.02783491252754406 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr":
0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.049135952012744975, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.049135952012744975 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101735, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.02548718714785938, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.02548718714785938 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7709677419354839, "acc_stderr": 0.02390491431178265, "acc_norm": 0.7709677419354839, "acc_norm_stderr": 0.02390491431178265 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.020986854593289733, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.020986854593289733 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563973, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563973 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251972, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251972 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03038835355188679, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.03038835355188679 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 
0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669235, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669235 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49074074074074076, "acc_stderr": 0.034093869469927006, "acc_norm": 0.49074074074074076, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092375, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092375 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.013468201614066306, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.013468201614066306 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7485549132947977, "acc_stderr": 0.02335736578587403, "acc_norm": 0.7485549132947977, "acc_norm_stderr": 0.02335736578587403 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43910614525139663, "acc_stderr": 0.01659802212058043, "acc_norm": 0.43910614525139663, "acc_norm_stderr": 0.01659802212058043 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984813, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984813 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.02447722285613511, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.02447722285613511 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46740547588005216, "acc_stderr": 0.01274307294265335, "acc_norm": 0.46740547588005216, "acc_norm_stderr": 0.01274307294265335 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6654411764705882, "acc_stderr": 0.0286619962023353, "acc_norm": 0.6654411764705882, "acc_norm_stderr": 0.0286619962023353 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507208, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507208 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.0282638899437846, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.0282638899437846 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5324357405140759, "mc1_stderr": 0.01746663214957761, "mc2": 0.6705208734039925, "mc2_stderr": 0.0153587467278896 }, "harness|winogrande|5": { "acc": 0.8721389108129439, "acc_stderr": 0.009385235583937267 }, "harness|gsm8k|5": { "acc": 0.6595905989385898, "acc_stderr": 0.013052097103299097 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
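The card's own snippet reads a single task configuration; the aggregated numbers shown under "Latest results" can be pulled the same way through the "results" configuration. A minimal sketch, assuming this repo defines the same "results" configuration and "latest" split as the other runs in this dump (its configs list below is truncated before that entry):

```python
from datasets import load_dataset

# "results" holds the aggregated scores per run; "latest" points at the
# most recent evaluation, here 2024-01-11T07:25:38.981116.
results = load_dataset(
    "open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1",
    "results",
    split="latest",
)
print(results[0])  # one row with the aggregated metrics shown above
```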
open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1
[ "region:us" ]
2024-01-11T07:27:58+00:00
{"pretty_name": "Evaluation run of dfurman/GarrulusMarcoro-7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [dfurman/GarrulusMarcoro-7B-v0.1](https://huggingface.co/dfurman/GarrulusMarcoro-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T07:25:38.981116](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1/blob/main/results_2024-01-11T07-25-38.981116.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.65235582933386,\n \"acc_stderr\": 0.032031502829239805,\n \"acc_norm\": 0.6517234590534214,\n \"acc_norm_stderr\": 0.032709809296815245,\n \"mc1\": 0.5324357405140759,\n \"mc1_stderr\": 0.01746663214957761,\n \"mc2\": 0.6705208734039925,\n \"mc2_stderr\": 0.0153587467278896\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6979522184300341,\n \"acc_stderr\": 0.013417519144716417,\n \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252423\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.715893248356901,\n \"acc_stderr\": 0.004500662294697923,\n \"acc_norm\": 0.8800039832702649,\n \"acc_norm_stderr\": 0.0032429275808698575\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066306,\n \"acc_norm\": 
0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066306\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n \"acc_stderr\": 0.01659802212058043,\n \"acc_norm\": 0.43910614525139663,\n \"acc_norm_stderr\": 0.01659802212058043\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.01274307294265335,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.01274307294265335\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5324357405140759,\n \"mc1_stderr\": 0.01746663214957761,\n \"mc2\": 0.6705208734039925,\n \"mc2_stderr\": 0.0153587467278896\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8721389108129439,\n \"acc_stderr\": 0.009385235583937267\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6595905989385898,\n \"acc_stderr\": 0.013052097103299097\n }\n}\n```", "repo_url": 
"https://huggingface.co/dfurman/GarrulusMarcoro-7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|arc:challenge|25_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|gsm8k|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hellaswag|10_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-25-38.981116.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-25-38.981116.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-25-38.981116.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T07-25-38.981116.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-25-38.981116.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T07_25_38.981116", "path": ["**/details_harness|winogrande|5_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T07-25-38.981116.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T07_25_38.981116", "path": ["results_2024-01-11T07-25-38.981116.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T07-25-38.981116.parquet"]}]}]}
2024-01-11T07:28:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dfurman/GarrulusMarcoro-7B-v0.1 Dataset automatically created during the evaluation run of model dfurman/GarrulusMarcoro-7B-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T07:25:38.981116 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
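For instance (a minimal sketch; the repo id below is an assumption that follows the leaderboard's usual `details_{org}__{model}` naming, and `harness_winogrande_5` is one of the configurations listed in this repo's metadata):

```python
from datasets import load_dataset

# Assumed repo id, following the open-llm-leaderboard "details" naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="train",           # per the card, "train" points at the latest results
)
print(data)
```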
[ "# Dataset Card for Evaluation run of dfurman/GarrulusMarcoro-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model dfurman/GarrulusMarcoro-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T07:25:38.981116 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dfurman/GarrulusMarcoro-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model dfurman/GarrulusMarcoro-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T07:25:38.981116 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
af8fed707853156f3ec1786c3bd927dc641f05fc
# Dataset of qiubai/チューバイ/仇白 (Arknights) This is the dataset of qiubai/チューバイ/仇白 (Arknights), containing 41 images and their tags. The core tags of this character are `long_hair, red_eyes, black_hair, breasts, bangs, horns, very_long_hair, earrings, grey_hair, large_breasts, hair_between_eyes, multicolored_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 41 | 81.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qiubai_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 41 | 38.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qiubai_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 102 | 84.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qiubai_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 41 | 67.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qiubai_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 102 | 126.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qiubai_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code: ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/qiubai_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; some outfits may be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, black_gloves, black_dress, elbow_gloves, bare_shoulders, holding_sword, looking_at_viewer, belt, closed_mouth, jewelry, cowboy_shot, pointy_ears, thigh_strap, thighhighs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_gloves | black_dress | elbow_gloves | bare_shoulders | holding_sword | looking_at_viewer | belt | closed_mouth | jewelry | cowboy_shot | pointy_ears | thigh_strap | thighhighs | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------|:---------------|:-----------------|:----------------|:--------------------|:-------|:---------------|:----------|:--------------|:--------------|:--------------|:-------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
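The packages table above also lists pre-processed `IMG+TXT` archives. Below is a minimal sketch for consuming one of them; the download call mirrors the raw-dataset snippet, while the pairing step assumes (the card does not confirm this) that each image in the archive ships with a same-named `.txt` file holding its tags:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/qiubai_arknights',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: foo.png + foo.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```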
CyberHarem/qiubai_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T07:29:02+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T07:38:00+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of qiubai/チューバイ/仇白 (Arknights) ====================================== This is the dataset of qiubai/チューバイ/仇白 (Arknights), containing 41 images and their tags. The core tags of this character are 'long\_hair, red\_eyes, black\_hair, breasts, bangs, horns, very\_long\_hair, earrings, grey\_hair, large\_breasts, hair\_between\_eyes, multicolored\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code: List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code:\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; some outfits may be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code:\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; some outfits may be mined here.", "### Raw Text Version", "### Table Version" ]
44df1f3cb06a3d6129ac75331dd77def58f502a1
# Dataset of lancet-2/Lancet-2/Lancet-2 (Arknights) This is the dataset of lancet-2/Lancet-2/Lancet-2 (Arknights), containing 22 images and their tags. The core tags of this character are `pointy_ears, short_hair, bangs, black_hair, tail, goggles_on_head, snake_tail, blue_eyes, breasts, hair_ornament, hair_flower, medium_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 22 | 37.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lancet_2_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 22 | 16.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lancet_2_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 55 | 38.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lancet_2_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 22 | 31.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lancet_2_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 55 | 58.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lancet_2_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code: ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/lancet_2_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; some outfits may be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bandeau, bare_shoulders, solo, thigh_strap, thighs, tube_top, black_gloves, blue_hairband, looking_at_viewer, black_panties, cleavage, holding, midriff, simple_background, white_background, ass, cowboy_shot, goggles, navel, open_mouth, smile, stomach, torn_clothes, wrench | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bandeau | bare_shoulders | solo | thigh_strap | thighs | tube_top | black_gloves | blue_hairband | looking_at_viewer | black_panties | cleavage | holding | midriff | simple_background | white_background | ass | cowboy_shot | goggles | navel | open_mouth | smile | stomach | torn_clothes | wrench | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-----------------|:-------|:--------------|:---------|:-----------|:---------------|:----------------|:--------------------|:----------------|:-----------|:----------|:----------|:--------------------|:-------------------|:------|:--------------|:----------|:--------|:-------------|:--------|:----------|:---------------|:---------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
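Building on the waifuc snippet above, here is a small sketch for filtering the loaded items by tag. It assumes that `item.meta['tags']` supports membership tests, as either a list of tag names or a tag-to-score mapping would; the tag `solo` is one that appears in the cluster table above:

```python
from waifuc.source import LocalSource

# iterate over the raw dataset extracted as shown above
source = LocalSource('dataset_dir')

solo_items = []
for item in source:
    tags = item.meta['tags']
    # 'in' works whether tags is a list of names or a tag -> score mapping
    if 'solo' in tags:
        solo_items.append(item)

print(f'{len(solo_items)} images tagged solo')
```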
CyberHarem/lancet_2_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T07:29:03+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T07:37:31+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of lancet-2/Lancet-2/Lancet-2 (Arknights) ================================================= This is the dataset of lancet-2/Lancet-2/Lancet-2 (Arknights), containing 22 images and their tags. The core tags of this character are 'pointy\_ears, short\_hair, bangs, black\_hair, tail, goggles\_on\_head, snake\_tail, blue\_eyes, breasts, hair\_ornament, hair\_flower, medium\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code: List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code:\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; some outfits may be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code:\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; some outfits may be mined here.", "### Raw Text Version", "### Table Version" ]
95f799655df2ea07b14a69f9d8f9344cfcf7fa22
# Dataset Card for Evaluation run of shitshow123/mistral7b_sft_dpo <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [shitshow123/mistral7b_sft_dpo](https://huggingface.co/shitshow123/mistral7b_sft_dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_shitshow123__mistral7b_sft_dpo", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T07:28:54.566656](https://huggingface.co/datasets/open-llm-leaderboard/details_shitshow123__mistral7b_sft_dpo/blob/main/results_2024-01-11T07-28-54.566656.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.24115205900155798, "acc_stderr": 0.030240327476101683, "acc_norm": 0.24138243110295876, "acc_norm_stderr": 0.031046885606606598, "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731608, "mc2": 0.4967512296032591, "mc2_stderr": 0.016399783558395026 }, "harness|arc:challenge|25": { "acc": 0.21075085324232082, "acc_stderr": 0.011918271754852184, "acc_norm": 0.27559726962457337, "acc_norm_stderr": 0.013057169655761838 }, "harness|hellaswag|10": { "acc": 0.25692093208524197, "acc_stderr": 0.004360424536145123, "acc_norm": 0.255327623979287, "acc_norm_stderr": 0.004351540603988566 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.042295258468165065, "acc_norm": 0.23, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.31851851851851853, "acc_stderr": 0.040247784019771096, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.040247784019771096 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.26973684210526316, "acc_stderr": 0.03611780560284898, "acc_norm": 0.26973684210526316, "acc_norm_stderr": 0.03611780560284898 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.22641509433962265, "acc_stderr": 0.025757559893106748, "acc_norm": 0.22641509433962265, "acc_norm_stderr": 0.025757559893106748 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.17, "acc_stderr": 0.0377525168068637, "acc_norm": 0.17, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 
0.04292346959909283 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2023121387283237, "acc_stderr": 0.030631145539198816, "acc_norm": 0.2023121387283237, "acc_norm_stderr": 0.030631145539198816 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.19607843137254902, "acc_stderr": 0.03950581861179962, "acc_norm": 0.19607843137254902, "acc_norm_stderr": 0.03950581861179962 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.20425531914893616, "acc_stderr": 0.026355158413349424, "acc_norm": 0.20425531914893616, "acc_norm_stderr": 0.026355158413349424 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.30344827586206896, "acc_stderr": 0.038312260488503336, "acc_norm": 0.30344827586206896, "acc_norm_stderr": 0.038312260488503336 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2275132275132275, "acc_stderr": 0.021591269407823792, "acc_norm": 0.2275132275132275, "acc_norm_stderr": 0.021591269407823792 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.24603174603174602, "acc_stderr": 0.03852273364924318, "acc_norm": 0.24603174603174602, "acc_norm_stderr": 0.03852273364924318 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.03861229196653694, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2, "acc_stderr": 0.022755204959542936, "acc_norm": 0.2, "acc_norm_stderr": 0.022755204959542936 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2561576354679803, "acc_stderr": 0.030712730070982592, "acc_norm": 0.2561576354679803, "acc_norm_stderr": 0.030712730070982592 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21212121212121213, "acc_stderr": 0.031922715695483, "acc_norm": 0.21212121212121213, "acc_norm_stderr": 0.031922715695483 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.2222222222222222, "acc_stderr": 0.02962022787479048, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.02962022787479048 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.26424870466321243, "acc_stderr": 0.03182155050916648, "acc_norm": 0.26424870466321243, "acc_norm_stderr": 0.03182155050916648 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2282051282051282, "acc_stderr": 0.02127839386358628, "acc_norm": 0.2282051282051282, "acc_norm_stderr": 0.02127839386358628 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.21851851851851853, "acc_stderr": 0.02519575225182379, "acc_norm": 0.21851851851851853, "acc_norm_stderr": 0.02519575225182379 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.19747899159663865, "acc_stderr": 0.025859164122051463, "acc_norm": 0.19747899159663865, "acc_norm_stderr": 0.025859164122051463 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2251655629139073, "acc_stderr": 
0.03410435282008937, "acc_norm": 0.2251655629139073, "acc_norm_stderr": 0.03410435282008937 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1908256880733945, "acc_stderr": 0.016847676400091105, "acc_norm": 0.1908256880733945, "acc_norm_stderr": 0.016847676400091105 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.18518518518518517, "acc_stderr": 0.026491914727355157, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.026491914727355157 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.03019028245350195, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.03019028245350195 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.29957805907172996, "acc_stderr": 0.029818024749753095, "acc_norm": 0.29957805907172996, "acc_norm_stderr": 0.029818024749753095 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.2062780269058296, "acc_stderr": 0.027157150479563824, "acc_norm": 0.2062780269058296, "acc_norm_stderr": 0.027157150479563824 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.19008264462809918, "acc_stderr": 0.03581796951709282, "acc_norm": 0.19008264462809918, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.17592592592592593, "acc_stderr": 0.03680918141673881, "acc_norm": 0.17592592592592593, "acc_norm_stderr": 0.03680918141673881 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25766871165644173, "acc_stderr": 0.03436150827846917, "acc_norm": 0.25766871165644173, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.04493949068613539, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.04493949068613539 }, "harness|hendrycksTest-management|5": { "acc": 0.21359223300970873, "acc_stderr": 0.04058042015646033, "acc_norm": 0.21359223300970873, "acc_norm_stderr": 0.04058042015646033 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2777777777777778, "acc_stderr": 0.029343114798094476, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.029343114798094476 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.19, "acc_stderr": 0.03942772444036623, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036623 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.25798212005108556, "acc_stderr": 0.01564583018834895, "acc_norm": 0.25798212005108556, "acc_norm_stderr": 0.01564583018834895 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2861271676300578, "acc_stderr": 0.02433214677913413, "acc_norm": 0.2861271676300578, "acc_norm_stderr": 0.02433214677913413 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808843, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808843 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.238562091503268, "acc_stderr": 0.02440439492808787, "acc_norm": 0.238562091503268, "acc_norm_stderr": 0.02440439492808787 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2958199356913183, "acc_stderr": 0.025922371788818798, "acc_norm": 0.2958199356913183, "acc_norm_stderr": 0.025922371788818798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2222222222222222, "acc_stderr": 0.023132376234543332, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.023132376234543332 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.24113475177304963, "acc_stderr": 0.025518731049537762, "acc_norm": 0.24113475177304963, "acc_norm_stderr": 0.025518731049537762 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24902216427640156, "acc_stderr": 0.01104489226404077, "acc_norm": 0.24902216427640156, "acc_norm_stderr": 0.01104489226404077 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.16544117647058823, "acc_stderr": 0.022571771025494763, "acc_norm": 0.16544117647058823, "acc_norm_stderr": 0.022571771025494763 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25163398692810457, "acc_stderr": 0.01755581809132226, "acc_norm": 0.25163398692810457, "acc_norm_stderr": 0.01755581809132226 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.18181818181818182, "acc_stderr": 0.03694284335337801, "acc_norm": 0.18181818181818182, "acc_norm_stderr": 0.03694284335337801 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.19183673469387755, "acc_stderr": 0.0252069631542254, "acc_norm": 0.19183673469387755, "acc_norm_stderr": 0.0252069631542254 }, "harness|hendrycksTest-sociology|5": { "acc": 0.22885572139303484, "acc_stderr": 0.029705284056772436, "acc_norm": 0.22885572139303484, "acc_norm_stderr": 0.029705284056772436 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.24, "acc_stderr": 0.04292346959909282, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-virology|5": { "acc": 0.21686746987951808, "acc_stderr": 0.03208284450356365, "acc_norm": 0.21686746987951808, "acc_norm_stderr": 0.03208284450356365 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2982456140350877, "acc_stderr": 0.03508771929824565, "acc_norm": 0.2982456140350877, "acc_norm_stderr": 0.03508771929824565 }, "harness|truthfulqa:mc|0": { "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731608, "mc2": 0.4967512296032591, "mc2_stderr": 0.016399783558395026 }, "harness|winogrande|5": { "acc": 0.531965272296764, "acc_stderr": 0.014023739221166384 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
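As a complement to the per-task example above, a minimal sketch for reading the aggregated numbers through the "results" configuration. The config name comes from this card; the "latest" split name and the record layout follow the pattern of sibling leaderboard repos and should be verified:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" is assumed to point at the newest run
results = load_dataset(
    "open-llm-leaderboard/details_shitshow123__mistral7b_sft_dpo",
    "results",
    split="latest",
)
print(results[0])  # e.g. the aggregated "all" accuracies shown above (layout assumed)
```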
open-llm-leaderboard/details_shitshow123__mistral7b_sft_dpo
[ "region:us" ]
2024-01-11T07:31:13+00:00
{"pretty_name": "Evaluation run of shitshow123/mistral7b_sft_dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [shitshow123/mistral7b_sft_dpo](https://huggingface.co/shitshow123/mistral7b_sft_dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shitshow123__mistral7b_sft_dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T07:28:54.566656](https://huggingface.co/datasets/open-llm-leaderboard/details_shitshow123__mistral7b_sft_dpo/blob/main/results_2024-01-11T07-28-54.566656.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.24115205900155798,\n        \"acc_stderr\": 0.030240327476101683,\n        \"acc_norm\": 0.24138243110295876,\n        \"acc_norm_stderr\": 0.031046885606606598,\n        \"mc1\": 0.2350061199510404,\n        \"mc1_stderr\": 0.014843061507731608,\n        \"mc2\": 0.4967512296032591,\n        \"mc2_stderr\": 0.016399783558395026\n    },\n    \"harness|arc:challenge|25\": {\n        \"acc\": 0.21075085324232082,\n        \"acc_stderr\": 0.011918271754852184,\n        \"acc_norm\": 0.27559726962457337,\n        \"acc_norm_stderr\": 0.013057169655761838\n    },\n    \"harness|hellaswag|10\": {\n        \"acc\": 0.25692093208524197,\n        \"acc_stderr\": 0.004360424536145123,\n        \"acc_norm\": 0.255327623979287,\n        \"acc_norm_stderr\": 0.004351540603988566\n    },\n    \"harness|hendrycksTest-abstract_algebra|5\": {\n        \"acc\": 0.23,\n        \"acc_stderr\": 0.042295258468165065,\n        \"acc_norm\": 0.23,\n        \"acc_norm_stderr\": 0.042295258468165065\n    },\n    \"harness|hendrycksTest-anatomy|5\": {\n        \"acc\": 0.31851851851851853,\n        \"acc_stderr\": 0.040247784019771096,\n        \"acc_norm\": 0.31851851851851853,\n        \"acc_norm_stderr\": 0.040247784019771096\n    },\n    \"harness|hendrycksTest-astronomy|5\": {\n        \"acc\": 0.26973684210526316,\n        \"acc_stderr\": 0.03611780560284898,\n        \"acc_norm\": 0.26973684210526316,\n        \"acc_norm_stderr\": 0.03611780560284898\n    },\n    \"harness|hendrycksTest-business_ethics|5\": {\n        \"acc\": 0.31,\n        \"acc_stderr\": 0.04648231987117316,\n        \"acc_norm\": 0.31,\n        \"acc_norm_stderr\": 0.04648231987117316\n    },\n    \"harness|hendrycksTest-clinical_knowledge|5\": {\n        \"acc\": 0.22641509433962265,\n        \"acc_stderr\": 0.025757559893106748,\n        \"acc_norm\": 0.22641509433962265,\n        \"acc_norm_stderr\": 0.025757559893106748\n    },\n    \"harness|hendrycksTest-college_biology|5\": {\n        \"acc\": 0.2569444444444444,\n        \"acc_stderr\": 0.03653946969442099,\n        \"acc_norm\": 0.2569444444444444,\n        \"acc_norm_stderr\": 0.03653946969442099\n    },\n    \"harness|hendrycksTest-college_chemistry|5\": {\n        \"acc\": 
0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.030631145539198816,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.030631145539198816\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2275132275132275,\n \"acc_stderr\": 0.021591269407823792,\n \"acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.021591269407823792\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924318,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924318\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.022755204959542936,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.022755204959542936\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.030712730070982592,\n \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.030712730070982592\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916648,\n \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916648\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.19747899159663865,\n \"acc_stderr\": 0.025859164122051463,\n \"acc_norm\": 0.19747899159663865,\n \"acc_norm_stderr\": 0.025859164122051463\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1908256880733945,\n \"acc_stderr\": 0.016847676400091105,\n \"acc_norm\": 0.1908256880733945,\n \"acc_norm_stderr\": 0.016847676400091105\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.026491914727355157,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.026491914727355157\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753095,\n \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753095\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2062780269058296,\n \"acc_stderr\": 0.027157150479563824,\n \"acc_norm\": 0.2062780269058296,\n \"acc_norm_stderr\": 0.027157150479563824\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.19008264462809918,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.19008264462809918,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.17592592592592593,\n \"acc_stderr\": 0.03680918141673881,\n \"acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.03680918141673881\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646033,\n \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646033\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.029343114798094476,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.029343114798094476\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25798212005108556,\n \"acc_stderr\": 
0.01564583018834895,\n \"acc_norm\": 0.25798212005108556,\n \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808843,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808843\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n \"acc_stderr\": 0.025922371788818798,\n \"acc_norm\": 0.2958199356913183,\n \"acc_norm_stderr\": 0.025922371788818798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543332,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543332\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n \"acc_stderr\": 0.01104489226404077,\n \"acc_norm\": 0.24902216427640156,\n \"acc_norm_stderr\": 0.01104489226404077\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494763,\n \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494763\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.03694284335337801,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.03694284335337801\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.0252069631542254,\n \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.0252069631542254\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824565,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824565\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731608,\n \"mc2\": 0.4967512296032591,\n \"mc2_stderr\": 0.016399783558395026\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.531965272296764,\n \"acc_stderr\": 0.014023739221166384\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/shitshow123/mistral7b_sft_dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|arc:challenge|25_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|gsm8k|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hellaswag|10_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-28-54.566656.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-28-54.566656.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-28-54.566656.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T07-28-54.566656.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-28-54.566656.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T07_28_54.566656", "path": ["**/details_harness|winogrande|5_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T07-28-54.566656.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T07_28_54.566656", "path": ["results_2024-01-11T07-28-54.566656.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T07-28-54.566656.parquet"]}]}]}
2024-01-11T07:31:37+00:00
[]
[]
TAGS
#region-us 
# Dataset Card for Evaluation run of shitshow123/mistral7b_sft_dpo

Dataset automatically created during the evaluation run of model shitshow123/mistral7b_sft_dpo on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2024-01-11T07:28:54.566656 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

## Dataset Details

### Dataset Description

- Curated by: 
- Funded by [optional]: 
- Shared by [optional]: 
- Language(s) (NLP): 
- License:

### Dataset Sources [optional]

- Repository: 
- Paper [optional]: 
- Demo [optional]:

## Uses

### Direct Use

### Out-of-Scope Use

## Dataset Structure

## Dataset Creation

### Curation Rationale

### Source Data

#### Data Collection and Processing

#### Who are the source data producers?

### Annotations [optional]

#### Annotation process

#### Who are the annotators?

#### Personal and Sensitive Information

## Bias, Risks, and Limitations

### Recommendations

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

[optional]

BibTeX:

APA:

## Glossary [optional]

## More Information [optional]

## Dataset Card Authors [optional]

## Dataset Card Contact
[ "# Dataset Card for Evaluation run of shitshow123/mistral7b_sft_dpo\n\n\n\nDataset automatically created during the evaluation run of model shitshow123/mistral7b_sft_dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T07:28:54.566656(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of shitshow123/mistral7b_sft_dpo\n\n\n\nDataset automatically created during the evaluation run of model shitshow123/mistral7b_sft_dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T07:28:54.566656(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
e74172f3302a576b8259702f033f154fe2f753a6
# Reveal: A Benchmark for Verifiers of Reasoning Chains

## [Paper: A Chain-of-Thought Is as Strong as Its Weakest Link: A Benchmark for Verifiers of Reasoning Chains](https://arxiv.org/abs/2402.00559)

Link: https://arxiv.org/abs/2402.00559

Abstract:

Prompting language models to provide step-by-step answers (e.g., "Chain-of-Thought") is the prominent approach for complex reasoning tasks, where more accurate reasoning chains typically improve downstream task performance. Recent literature discusses automatic methods to verify reasoning steps to evaluate and improve their correctness. However, no fine-grained step-level datasets are available to enable thorough evaluation of such verification methods, hindering progress in this direction. We introduce Reveal: *Reasoning Verification Evaluation*, a new dataset to benchmark automatic verifiers of complex Chain-of-Thought reasoning in open-domain question answering settings. Reveal includes comprehensive labels for the relevance, attribution to evidence passages, and logical correctness of each reasoning step in a language model's answer, across a wide variety of datasets and state-of-the-art language models.

### Usage

To load the dataset:

```python
! pip install datasets
from datasets import load_dataset

reveal = load_dataset("google/reveal")
reveal_eval = reveal['eval']  # select Reveal-Eval, the evaluation split
reveal_open = reveal['open']  # select Reveal-Open, the hard-cases split with low-confidence annotations
```

**Note: The above loads a table from `eval/reveal_eval.csv`, which is convenient for working with the data at scale. There is another file, `eval/reveal_eval.json`, with a more intuitive JSON structure, if you prefer that format.**

Some examples of how to handle the data by deriving step-level tasks:

```python
import pandas as pd

reveal_eval = pd.DataFrame(reveal_eval)

# Step Attribution task
eval_attr = reveal_eval[~reveal_eval.evidence.isna()].reset_index(drop=True)
eval_attr['decontextualized_step'] = eval_attr['decontextualized_step'].fillna(eval_attr['step'])
# Fields:
# Premise: [evidence]
# Hypothesis: [decontextualized_step]
# Gold label: [attribution_label]

# Step Logic task
def _make_history(row):
    return row['question'] + ' ' + row['full_answer'].split(row['step'].strip())[0]

eval_logic = reveal_eval.drop_duplicates(subset=['answer_id', 'step_idx']).reset_index(drop=True)
eval_logic = eval_logic[(eval_logic['type_label'] == 'Logical step.') & (eval_logic['logic_relevance_label'] == 'Relevant') & (~eval_logic['correctness_label'].isna())]
eval_logic['history'] = eval_logic.apply(_make_history, axis=1)
# Fields:
# Premise: [history]
# Hypothesis: [step]
# Gold label: [correctness_label]

# Step Relevance task
eval_relevance = reveal_eval.drop_duplicates(subset=['answer_id', 'step_idx']).reset_index(drop=True)
eval_relevance['relevance_label'] = (eval_relevance['logic_relevance_label'] == 'Relevant') | (eval_relevance['attribution_relevance_label'] == 'Yes')
# Fields:
# Question: [question]
# Answer: [full_answer]
# Step: [step]
# Gold label: [relevance_label]

# Step Type task
eval_type = reveal_eval.drop_duplicates(subset=['answer_id', 'step_idx']).reset_index(drop=True)
# Fields:
# Question: [question]
# Answer: [full_answer]
# Step: [step]
# Gold label: [type_label]

# CoT Full Correctness task
# Get a list of the final rated evidence passages for each answer_id and concatenate the list into one string:
rated_evidence_per_answer = {
    answer_id: reveal_eval[(reveal_eval.answer_id == answer_id) &
                           reveal_eval.is_final_rated_evidence_for_step]['evidence']
    for answer_id in reveal_eval['answer_id'].unique()
}
rated_evidence_per_answer = {
    k: '\n'.join([f'Evidence {i+1}: {e}' for i, e in enumerate(v)])
    for k, v in rated_evidence_per_answer.items()
}

# Prepare the eval DataFrame:
answer_correctness_eval = reveal_eval.drop_duplicates(subset=['answer_id']).reset_index(drop=True)
answer_correctness_eval['all_rated_evidence'] = answer_correctness_eval['answer_id'].apply(lambda x: rated_evidence_per_answer[x])
answer_correctness_eval = answer_correctness_eval[['answer_id','question','full_answer','all_rated_evidence','answer_is_fully_attributable','answer_is_logically_correct','answer_is_fully_attributable_and_correct']]
```
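Once a derived task table such as `eval_attr` exists, scoring a verifier reduces to comparing its predictions against the gold labels. The snippet below is only a sketch: it runs a trivial majority-class baseline (a real verifier's predictions would replace `preds`), and it makes no assumption about the label vocabulary beyond what is present in the column:

```python
# Majority-class baseline on the Step Attribution task built above.
# Purely illustrative; substitute a real verifier's predictions for `preds`.
majority_label = eval_attr['attribution_label'].mode()[0]  # most frequent gold label
preds = pd.Series([majority_label] * len(eval_attr), index=eval_attr.index)
baseline_accuracy = (preds == eval_attr['attribution_label']).mean()
print(f'Majority-class attribution accuracy: {baseline_accuracy:.3f}')
```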
### **This is an evaluation benchmark. It should not be included in training data for NLP models.**

Please do not redistribute any part of the dataset without sufficient protection against web-crawlers.

A 64-character identifier string is added to each instance in the dataset to assist in future detection of contamination in web-crawl corpora. The Reveal dataset's string is:

`Reveal:Mn12GAs2I3S0eWjbTUFC0Y51ijGFB7rGBLnzGGhCQ7OtJPfVg7e6qt9zb5RPL36U`

The same has been done to the few-shot prompting demonstrations, to detect whether these demonstrations have been in a model's training data (if so, these demonstrations should not be used for few-shot evaluation of that model). The few-shot demonstrations' string is:

`Reveal:HlyeWxw8BRcQ2dPGShTUUjn03uULZOyeNbzKzRIg4QihZ45k1lrye46OoUzi3kkW`
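The card leaves the scanning procedure itself open; a minimal sketch of such a check might look as follows (the corpus path, line-oriented file format, and helper name are illustrative assumptions, not official tooling):

```python
# Hypothetical contamination check. Only the two identifier strings come
# from this card; the corpus path and format are illustrative assumptions.
REVEAL_EVAL_ID = "Reveal:Mn12GAs2I3S0eWjbTUFC0Y51ijGFB7rGBLnzGGhCQ7OtJPfVg7e6qt9zb5RPL36U"
REVEAL_FEWSHOT_ID = "Reveal:HlyeWxw8BRcQ2dPGShTUUjn03uULZOyeNbzKzRIg4QihZ45k1lrye46OoUzi3kkW"

def count_contamination_hits(corpus_path: str) -> dict:
    """Count occurrences of the Reveal identifier strings in a plain-text corpus."""
    hits = {REVEAL_EVAL_ID: 0, REVEAL_FEWSHOT_ID: 0}
    with open(corpus_path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            for marker in hits:
                hits[marker] += line.count(marker)
    return hits

# Example (hypothetical file name):
# hits = count_contamination_hits("pretraining_shard_000.txt")
```

Any nonzero count would suggest that Reveal instances (or its few-shot demonstrations) made it into the scanned corpus, in which case the restrictions above apply.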
#### Fields and Descriptions

* **dataset**: Source dataset
* **question_id**: ID of the original instance from the source dataset
* **question**: The question text
* **answer_model**: Model which generated the CoT answer
* **answer_id**: ID of a particular model's answer to a question (question_id + answer_model)
* **step_idx**: Step index in the answer for this row
* **full_answer**: Full CoT answer generated by the model
* **step**: The step from the full CoT answer which matches "step_idx", the subject of the row
* **decontextualized_step**: The decontextualized version of the step that we used for evidence retrieval (and for the NLI classification evaluation settings)
* **attribution_relevance_label**: Majority label for the relevance annotations in the attribution task
* **attribution_relevance_majority**: Max # of raters which agreed with each other for this rating
* **attribution_relevance_annotations**: The annotations for each rater (ordered list)
* **attribution_relevance_raters**: The raters (ordered list)
* **attribution_relevance_num_ratings**: The number of raters/ratings
* **evidence_id**: The evidence id (from 1 to 3) used for the annotation in this row
* **evidence**: The evidence used for the annotation in this row
* **attribution_label**: The majority label for whether the evidence supports the step
* **attribution_majority**: Max # of raters which agreed with each other for this rating
* **attribution_annotations**: The annotations for each rater (ordered list)
* **attribution_raters**: The raters (ordered list)
* **attribution_num_ratings**: The number of raters/ratings
* **attribution_justifications**: The justifications of each rater (ordered list) - note that the raters gave one justification for every step, *not* for every evidence
* **annotated_in_attribution_batch**: Which batch this was annotated in (we had 5 annotation batches)
* **type_label**: Majority label for whether the step is an attribution step, logical step or both
* **type_majority**: Max # of raters which agreed with each other for this rating
* **type_annotations**: The annotations for each rater (ordered list)
* **type_raters**: The raters (ordered list)
* **type_num_ratings**: The number of raters/ratings
* **logic_relevance_label**: Majority label for relevance annotations in the logic task
* **logic_relevance_majority**: Max # of raters which agreed with each other for this rating
* **logic_relevance_annotations**: The annotations for each rater (ordered list)
* **logic_relevance_raters**: The raters (ordered list)
* **logic_relevance_num_ratings**: The number of raters/ratings
* **logic_justifications**: Justifications of each rater (ordered list) - note that the raters gave one justification to all ratings of every step (i.e., one justification for the ratings of type + relevance + correctness together)
* **annotated_in_logic_batch**: Which batch this was annotated in (we had 5 annotation batches)
* **correctness_label**: Majority label for whether the step is logically correct given the question + previous steps
* **correctness_majority**: Max # of raters which agreed with each other for this rating
* **correctness_annotations**: The annotations for each rater (ordered list)
* **correctness_raters**: The raters (ordered list)
* **correctness_num_ratings**: The number of raters/ratings
* **agreement_majority_all_steps**: Minimum agreement majority across the attribution and logic ratings for all steps
* **is_low_agreement_hard_case**: agreement_majority_all_steps <= 2. This boolean indicates whether the annotations for this answer contain a step with non-trustworthy annotations. This is the difference between Reveal-Eval and Reveal-Open.
* **contamination_identifier**: An identification string for contamination detection.
* **is_final_rated_evidence_for_step**: Whether this step-evidence pair is the final attribution rating for this step (we try 3 evidence passages, and stop when we find a supporting or contradicting one. The rating in this row is the final attribution rating for the step across all evidence passages)
* **answer_is_fully_attributable**: Whether all attribution steps in the answer are fully attributable to some evidence
* **answer_is_logically_correct**: Whether all logic steps are logically correct
* **answer_is_fully_attributable_and_correct**: Whether all steps are correct (fully attributable or logical)
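To make the answer-level fields concrete, here is a small sanity-check sketch. It assumes the pandas `reveal_eval` DataFrame from the Usage section and that these columns load as booleans (CSV loading may require explicit casting); treat it as illustrative rather than part of the official loader:

```python
# Illustrative consistency checks derived from the field descriptions above;
# not part of the official loader. Assumes boolean dtypes after loading.
answers = reveal_eval.drop_duplicates(subset=['answer_id'])

# Reveal-Eval is the high-confidence split, so no answer here should be a
# low-agreement hard case (those belong to Reveal-Open).
assert not answers['is_low_agreement_hard_case'].any()

# Per the definitions above, an answer is fully attributable and correct
# exactly when both answer-level component flags hold.
derived = (answers['answer_is_fully_attributable']
           & answers['answer_is_logically_correct'])
assert (derived == answers['answer_is_fully_attributable_and_correct']).all()
```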
google/reveal
[ "task_categories:text-classification", "task_categories:question-answering", "size_categories:1K<n<10K", "language:en", "license:cc-by-nd-4.0", "arxiv:2402.00559", "region:us" ]
2024-01-11T08:11:10+00:00
{"language": ["en"], "license": "cc-by-nd-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-classification", "question-answering"], "pretty_name": "Reveal", "configs": [{"config_name": "default", "data_files": [{"split": "eval", "path": "eval/reveal_eval.csv"}, {"split": "open", "path": "open/reveal_open.csv"}]}], "extra_gated_prompt": "By clicking \u201cAccess repository\u201d below, you confirm your understanding that this resource is permitted for use as a test set, but not as a training set, and should not be uploaded to the internet where web-crawlers can access it (such as plain-text in github, or in an academic PDF). Please ensure adherence to the terms detailed in the paper. If you are unsure about your specific case, don't hesitate to contact: [email protected]."}
2024-02-08T14:59:56+00:00
[ "2402.00559" ]
[ "en" ]
TAGS #task_categories-text-classification #task_categories-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-nd-4.0 #arxiv-2402.00559 #region-us
# Reveal: A Benchmark for Verifiers of Reasoning Chains ## Paper: A Chain-of-Thought Is as Strong as Its Weakest Link: A Benchmark for Verifiers of Reasoning Chains Link: URL Abstract: Prompting language models to provide step-by-step answers (e.g., "Chain-of-Thought") is the prominent approach for complex reasoning tasks, where more accurate reasoning chains typically improve downstream task performance. Recent literature discusses automatic methods to verify reasoning steps to evaluate and improve their correctness. However, no fine-grained step-level datasets are available to enable thorough evaluation of such verification methods, hindering progress in this direction. We introduce Reveal: *Reasoning Verification Evaluation*, a new dataset to benchmark automatic verifiers of complex Chain-of-Thought reasoning in open-domain question answering settings. Reveal includes comprehensive labels for the relevance, attribution to evidence passages, and logical correctness of each reasoning step in a language model's answer, across a wide variety of datasets and state-of-the-art language models. ### Usage To load the dataset: Note: The above provides a table from 'eval/reveal_eval.csv' for easily working at scale with the data. There is another file 'eval/reveal_eval.json' with a more intuitive json structure, if you prefer this format. Some examples of how to handle the data by deriving step-level tasks: ### This is an evaluation benchmark. It should not be included in training data for NLP models. Please do not redistribute any part of the dataset without sufficient protection against web-crawlers. An identifier 64-character string is added to each instance in the dataset to assist in future detection of contamination in web-crawl corpora. The reveal dataset's string is: 'Reveal:Mn12GAs2I3S0eWjbTUFC0Y51ijGFB7rGBLnzGGhCQ7OtJPfVg7e6qt9zb5RPL36U' The same has been done to the few-shot prompting demonstrations, to detect whether these demonstrations have been in a model's training data (if so, these demonstrations should not be used for few-shot evaluation of that model).
The few-shot demonstrations' string is: 'Reveal:HlyeWxw8BRcQ2dPGShTUUjn03uULZOyeNbzKzRIg4QihZ45k1lrye46OoUzi3kkW' #### Fields and Descriptions * dataset: Source dataset * question_id: ID of the original instance from the source dataset * question: The question text * answer_model: Model which generated the CoT answer * answer_id: ID of a particular model's answer to a question (question_id + answer_model) * step_idx: Step index in the answer for this row * full_answer: Full CoT answer generated by the model * step: The step from the full CoT answer which matches "step_idx", the subject of the row * decontextualized_step: The decontextualized version of the step that we used for evidence retrieval (and for the NLI classification evaluations settings) * attribution_relevance_label: Majority label for the relevance annotations in the attribution task * attribution_relevance_majority: Max # of raters which agreed with each other for this rating * attribution_relevance_annotations: The annotations for each rater (ordered list) * attribution_relevance_raters: The raters (ordered list) * attribution_relevance_num_ratings: The number of raters/ratings * evidence_id: The evidence id (from 1 to 3) used for the annotation in this row * evidence: The evidence used for the annotation in this row * attribution_label: The majority label for whether the evidence supports the step * attribution_majority: Max # of raters which agreed with each other for this rating * attribution_annotations: The annotations for each rater (ordered list) * attribution_raters: The raters (ordered list) * attribution_num_ratings: The number of raters/ratings * attribution_justifications: The justifications of each rater (ordered list) - note that the raters gave one justification for every step, *not* for every evidence * annotated_in_attribution_batch: Which batch this was annotated in (we had 5 annotation batches) * type_label: Majority label for whether the step is an attribution step, logical step or both * type_majority: Max # of raters which agreed with each other for this rating * type_annotations: The annotations for each rater (ordered list) * type_raters: The raters (ordered list) * type_num_ratings: The number of raters/ratings * logic_relevance_label: Majority label for relevance annotations in the logic task * logic_relevance_majority: Max # of raters which agreed with each other for this rating * logic_relevance_annotations: The annotations for each rater (ordered list) * logic_relevance_raters: The raters (ordered list) * logic_relevance_num_ratings: The number of raters/ratings * logic_justifications: Justifications of each rater (ordered list) - note that the raters gave one justification to all ratings of every step (i.e., one justification for the ratings of type + relevance + correctness together) * annotated_in_logic_batch: Which batch this was annotated in (we had 5 annotation batches) * correctness_label: Majority label for whether the step is logically correct given the question + previous steps * correctness_majority: Max # of raters which agreed with each other for this rating * correctness_annotations: The annotations for each rater (ordered list) * correctness_raters: The raters (ordered list) * correctness_num_ratings: The number of raters/ratings * agreement_majority_all_steps: Minimum agreement majority across the attribution and logic ratings for all steps * is_low_agreement_hard_case: agreement_majority_all_steps <= 2.
This boolean indicates whether the annotations for this answer contain a step with non-trustworthy annotations. This is the difference between Reveal-Eval and Reveal-Open. * contamination_identifier: An identification string for contamination detection. * is_final_rated_evidence_for_step: Whether this step-evidence pair is the final attribution rating for this step (we try 3 evidences, and stop when we find a supporting or contradicting evidence. The rating in this row is the final attribution rating for the step across all evidence passages) * answer_is_fully_attributable: Whether all attribution steps in the answer are fully attributable to some evidence * answer_is_logically_correct: Whether all logic steps are logically correct * answer_is_fully_attributable_and_correct: Whether all steps are correct (fully attributable or logical)
[ "# Reveal: A Benchmark for Verifiers of Reasoning Chains", "## Paper: A Chain-of-Thought Is as Strong as Its Weakest Link: A Benchmark for Verifiers of Reasoning Chains\n\nLink: URL \n\nAbstract: \nPrompting language models to provide step-by-step answers (e.g., \"Chain-of-Thought\") is the prominent approach for complex reasoning tasks, where more accurate reasoning chains typically improve downstream task performance.\nRecent literature discusses automatic methods to verify reasoning steps to evaluate and improve their correctness. However, no fine-grained step-level datasets are available to enable thorough evaluation of such verification methods, hindering progress in this direction.\nWe introduce Reveal: *Reasoning Verification Evaluation*, a new dataset to benchmark automatic verifiers of complex Chain-of-Thought reasoning in open-domain question answering settings.\nReveal includes comprehensive labels for the relevance, attribution to evidence passages, and logical correctness of each reasoning step in a language model's answer, across a wide variety of datasets and state-of-the-art language models.", "### Usage\n\nTo load the dataset:\n\n\n\nNote: The above provides a table from 'eval/reveal_eval.csv' for easily working at scale with the data. There is another file 'eval/reveal_eval.json' with a more intuitive json structure, if you prefer this format.\n\nSome examples of how to handle the data by deriving step-level tasks:", "### This is an evaluation benchmark. It should not be included in training data for NLP models.\n\nPlease do not redistribute any part of the dataset without sufficient protection against web-crawlers.\n\nAn identifier 64-character string is added to each instance in the dataset to assist in future detection of contamination in web-crawl corporta. 
\n\nThe reveal dataset's string is: 'Reveal:Mn12GAs2I3S0eWjbTUFC0Y51ijGFB7rGBLnzGGhCQ7OtJPfVg7e6qt9zb5RPL36U'\n\nThe same has been done to the few-shot prompting demonstrations, to detect whether these demonstrations have been in a model's training data (if so, these demonstrations should not be used for few-shot evaluation of that model).\n\nThe few-shot demonstrations' string is: 'Reveal:HlyeWxw8BRcQ2dPGShTUUjn03uULZOyeNbzKzRIg4QihZ45k1lrye46OoUzi3kkW'", "#### Fields and Descriptions\n\n* dataset:\tSource dataset\n* question_id:\tID of the original instance from the source dataset\n* question:\tThe question text\n* answer_model:\tModel which generated the CoT answer\n* answer_id:\tID of a particular model's answer to a question (question_id + answer_model)\n* step_idx:\tStep index in the answer for this row\n* full_answer:\tFull CoT answer generated by the model\n* step:\tThe step from the full CoT answer which matches \"step_idx\", the subject of the row\n* decontextualized_step:\tThe decontextualized version of the step that we used for evidence retrieval (and for the NLI classification evaluations settings)\n* attribution_relevance_label:\tMajority label for the relevance annotations in the attribution task\n* attribution_relevance_majority:\tMax # of raters which agreed with each other for this rating\n* attribution_relevance_annotations:\tThe annotations for each rater (ordered list)\n* attribution_relevance_raters:\tThe raters (ordered list)\n* attribution_relevance_num_ratings:\tThe number of raters/ratings\n* evidence_id:\tThe evidence id (from 1 to 3) used for the annotation in this row\n* evidence:\tThe evidence used for the annotation in this row\n* attribution_label:\tThe majority label for whether the evidence supports the step\n* attribution_majority:\tMax # of raters which agreed with each other for this rating\n* attribution_annotations:\tThe annotations for each rater (ordered list)\n* attribution_raters:\tThe raters (ordered list)\n* attribution_num_ratings:\tThe number of raters/ratings\n* attribution_justifications:\tThe justifications of each rater (ordered list) - note that the raters gave one justification for every step, *not* for every evidence\n* annotated_in_attribution_batch:\tWhich batch this was annotated in (we had 5 annotation batches)\n* type_label:\tMajority label for whether the step is an attribution step, logical step or both\n* type_majority:\tMax # of raters which agreed with each other for this rating\n* type_annotations:\tThe annotations for each rater (ordered list)\n* type_raters:\tThe raters (ordered list)\n* type_num_ratings:\tThe number of raters/ratings\n* logic_relevance_label:\tMajority label for relevance annotations in the logic task\n* logic_relevance_majority:\tMax # of raters which agreed with each other for this rating\n* logic_relevance_annotations:\tThe annotations for each rater (ordered list)\n* logic_relevance_raters:\tThe raters (ordered list)\n* logic_relevance_num_ratings:\tThe number of raters/ratings\n* logic_justifications:\tJustifications of each rater (ordered list) - note that the raters gave one justification to all ratings of every step (i.e., one justification for the ratings of type + relevance + correctness together) \n* annotated_in_logic_batch:\tWhich batch this was annotated in (we had 5 annotation batches)\n* correctness_label:\tMajority label for whether the step is logically correct given the question + previous steps\n* correctness_majority:\tMax # of raters which agreed with each other for this rating\n* 
correctness_annotations:\tThe annotations for each rater (ordered list)\n* correctness_raters:\tThe raters (ordered list)\n* correctness_num_ratings:\tThe number of raters/ratings\n* agreement_majority_all_steps:\tMinimum agreement majority across the attribution and logic ratings for all steps\n* is_low_agreement_hard_case:\tagreement_majority_all_steps <= 2. This boolean indicates whether the annotations for this answer contain a step with non-trustworthy annotations. This is the difference between Reveal-Eval and Reveal-Open.\n* contamination_identifier:\tAn identification string for contamination detection.\n* is_final_rated_evidence_for_step: Whether this step-evidence pair is the final attribution rating for this step (we try 3 evidences, and stop when we find a supporting or contradicting evidence. The rating in this row is the final attribution rating for the ste pacross all evidence passages)\n* answer_is_fully_attributable: Whether all attribution steps in the answer are fully attributable to some evidence\n* answer_is_logically_correct: Whether all logic steps are logically correct\n* answer_is_fully_attributable_and_correct: Whether all steps are correct (fully attributable or logical)" ]
[ "TAGS\n#task_categories-text-classification #task_categories-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-nd-4.0 #arxiv-2402.00559 #region-us \n", "# Reveal: A Benchmark for Verifiers of Reasoning Chains", "## Paper: A Chain-of-Thought Is as Strong as Its Weakest Link: A Benchmark for Verifiers of Reasoning Chains\n\nLink: URL \n\nAbstract: \nPrompting language models to provide step-by-step answers (e.g., \"Chain-of-Thought\") is the prominent approach for complex reasoning tasks, where more accurate reasoning chains typically improve downstream task performance.\nRecent literature discusses automatic methods to verify reasoning steps to evaluate and improve their correctness. However, no fine-grained step-level datasets are available to enable thorough evaluation of such verification methods, hindering progress in this direction.\nWe introduce Reveal: *Reasoning Verification Evaluation*, a new dataset to benchmark automatic verifiers of complex Chain-of-Thought reasoning in open-domain question answering settings.\nReveal includes comprehensive labels for the relevance, attribution to evidence passages, and logical correctness of each reasoning step in a language model's answer, across a wide variety of datasets and state-of-the-art language models.", "### Usage\n\nTo load the dataset:\n\n\n\nNote: The above provides a table from 'eval/reveal_eval.csv' for easily working at scale with the data. There is another file 'eval/reveal_eval.json' with a more intuitive json structure, if you prefer this format.\n\nSome examples of how to handle the data by deriving step-level tasks:", "### This is an evaluation benchmark. It should not be included in training data for NLP models.\n\nPlease do not redistribute any part of the dataset without sufficient protection against web-crawlers.\n\nAn identifier 64-character string is added to each instance in the dataset to assist in future detection of contamination in web-crawl corporta. 
\n\nThe reveal dataset's string is: 'Reveal:Mn12GAs2I3S0eWjbTUFC0Y51ijGFB7rGBLnzGGhCQ7OtJPfVg7e6qt9zb5RPL36U'\n\nThe same has been done to the few-shot prompting demonstrations, to detect whether these demonstrations have been in a model's training data (if so, these demonstrations should not be used for few-shot evaluation of that model).\n\nThe few-shot demonstrations' string is: 'Reveal:HlyeWxw8BRcQ2dPGShTUUjn03uULZOyeNbzKzRIg4QihZ45k1lrye46OoUzi3kkW'", "#### Fields and Descriptions\n\n* dataset:\tSource dataset\n* question_id:\tID of the original instance from the source dataset\n* question:\tThe question text\n* answer_model:\tModel which generated the CoT answer\n* answer_id:\tID of a particular model's answer to a question (question_id + answer_model)\n* step_idx:\tStep index in the answer for this row\n* full_answer:\tFull CoT answer generated by the model\n* step:\tThe step from the full CoT answer which matches \"step_idx\", the subject of the row\n* decontextualized_step:\tThe decontextualized version of the step that we used for evidence retrieval (and for the NLI classification evaluations settings)\n* attribution_relevance_label:\tMajority label for the relevance annotations in the attribution task\n* attribution_relevance_majority:\tMax # of raters which agreed with each other for this rating\n* attribution_relevance_annotations:\tThe annotations for each rater (ordered list)\n* attribution_relevance_raters:\tThe raters (ordered list)\n* attribution_relevance_num_ratings:\tThe number of raters/ratings\n* evidence_id:\tThe evidence id (from 1 to 3) used for the annotation in this row\n* evidence:\tThe evidence used for the annotation in this row\n* attribution_label:\tThe majority label for whether the evidence supports the step\n* attribution_majority:\tMax # of raters which agreed with each other for this rating\n* attribution_annotations:\tThe annotations for each rater (ordered list)\n* attribution_raters:\tThe raters (ordered list)\n* attribution_num_ratings:\tThe number of raters/ratings\n* attribution_justifications:\tThe justifications of each rater (ordered list) - note that the raters gave one justification for every step, *not* for every evidence\n* annotated_in_attribution_batch:\tWhich batch this was annotated in (we had 5 annotation batches)\n* type_label:\tMajority label for whether the step is an attribution step, logical step or both\n* type_majority:\tMax # of raters which agreed with each other for this rating\n* type_annotations:\tThe annotations for each rater (ordered list)\n* type_raters:\tThe raters (ordered list)\n* type_num_ratings:\tThe number of raters/ratings\n* logic_relevance_label:\tMajority label for relevance annotations in the logic task\n* logic_relevance_majority:\tMax # of raters which agreed with each other for this rating\n* logic_relevance_annotations:\tThe annotations for each rater (ordered list)\n* logic_relevance_raters:\tThe raters (ordered list)\n* logic_relevance_num_ratings:\tThe number of raters/ratings\n* logic_justifications:\tJustifications of each rater (ordered list) - note that the raters gave one justification to all ratings of every step (i.e., one justification for the ratings of type + relevance + correctness together) \n* annotated_in_logic_batch:\tWhich batch this was annotated in (we had 5 annotation batches)\n* correctness_label:\tMajority label for whether the step is logically correct given the question + previous steps\n* correctness_majority:\tMax # of raters which agreed with each other for this rating\n* 
correctness_annotations:\tThe annotations for each rater (ordered list)\n* correctness_raters:\tThe raters (ordered list)\n* correctness_num_ratings:\tThe number of raters/ratings\n* agreement_majority_all_steps:\tMinimum agreement majority across the attribution and logic ratings for all steps\n* is_low_agreement_hard_case:\tagreement_majority_all_steps <= 2. This boolean indicates whether the annotations for this answer contain a step with non-trustworthy annotations. This is the difference between Reveal-Eval and Reveal-Open.\n* contamination_identifier:\tAn identification string for contamination detection.\n* is_final_rated_evidence_for_step: Whether this step-evidence pair is the final attribution rating for this step (we try 3 evidences, and stop when we find a supporting or contradicting evidence. The rating in this row is the final attribution rating for the ste pacross all evidence passages)\n* answer_is_fully_attributable: Whether all attribution steps in the answer are fully attributable to some evidence\n* answer_is_logically_correct: Whether all logic steps are logically correct\n* answer_is_fully_attributable_and_correct: Whether all steps are correct (fully attributable or logical)" ]
26a9e56c488504d2311a2290896bac66414b46ee
# Dataset Card for "wiki-concept-gen-chatml" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sam-mosaic/wiki-concept-gen-chatml
[ "region:us" ]
2024-01-11T08:20:50+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2025503.565156988, "num_examples": 3239}, {"name": "test", "num_bytes": 225125.43484301196, "num_examples": 360}], "download_size": 468996, "dataset_size": 2250629.0}}
2024-01-11T08:27:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for "wiki-concept-gen-chatml" More Information needed
[ "# Dataset Card for \"wiki-concept-gen-chatml\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"wiki-concept-gen-chatml\"\n\nMore Information needed" ]
d447e4007c48a9da2faa529a092489f26e657709
# Generated Questions and Answers from the Falcon RefinedWeb Dataset

This dataset contains 1k open-domain questions and answers generated with GPT-4 from documents in Falcon's [refinedweb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) dataset. You can find more details about this work in the following [blogpost](https://www.pinecone.io/blog/rag-study/).

Each row consists of:

- **document_id** - the id of the text chunk from the refinedweb dataset from which the question was generated. Each id contains the original document index from the refinedweb dataset and the chunk index, in the following format: "${REFINEDWEB_ID}_${CHUNK_INDEX}"
- **document_text** - the text of the chunk from which the question was generated.
- **generated_question** - the generated question.
- **generated_answer** - the corresponding generated answer.
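As a quick usage sketch (not part of the original card), the snippet below loads the dataset and splits `document_id` back into its two components. The split name `train` is an assumption, since the card does not state it.

```python
from datasets import load_dataset

# Assumption: the default config exposes a 'train' split; adjust if needed.
ds = load_dataset("pinecone/refinedweb-generated-questions", split="train")

row = ds[0]
# document_id follows the format "${REFINEDWEB_ID}_${CHUNK_INDEX}", so the
# last underscore separates the document index from the chunk index.
refinedweb_id, chunk_index = row["document_id"].rsplit("_", 1)

print(refinedweb_id, int(chunk_index))
print(row["generated_question"])
print(row["generated_answer"])
```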
pinecone/refinedweb-generated-questions
[ "task_categories:question-answering", "size_categories:1K<n<10K", "language:en", "license:mit", "region:us" ]
2024-01-11T08:34:51+00:00
{"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"]}
2024-01-18T11:06:32+00:00
[]
[ "en" ]
TAGS #task_categories-question-answering #size_categories-1K<n<10K #language-English #license-mit #region-us
# Generated Questions and Answers from the Falcon RefinedWeb Dataset This dataset contains 1k open-domain questions and answers generated using documents from Falcon's refinedweb dataset using GPT-4. You can find more details about this work in the following blogpost. Each row consists of: - document_id - an id of a text chunk from the refined web dataset, from which the question was generated. Each id contains the original document index from the refinedweb dataset, and the chunk index in the following format: "${REFINEDWEB_ID}_${CHUNK_INDEX}" - document_text - the text of the chunk from which the question was generated. - generated_question - the generated question. - generated_answer - the corresponding generated answer.
[ "# Generated Questions and Answers from the Falcon RefinedWeb Dataset\n\nThis dataset contains 1k open-domain questions and answers generated using documents from Falcon's refinedweb dataset using GPT-4. You can find more details about this work in the following blogpost.\n\nEach row consits of:\n\n- document_id - an id of a text chunk from the refined web dataset, from which the question was generated. Each id contains the original document index from the refinedweb dataset, and the chunk index in the following format: \"${REFINEDWEB_ID}_${CHUNK_INDEX}\"\n- document_text - the text of the chunk from which the question was generated.\n- generated_question - the generated question.\n- generated_answer - the corresponding generated answer." ]
[ "TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-English #license-mit #region-us \n", "# Generated Questions and Answers from the Falcon RefinedWeb Dataset\n\nThis dataset contains 1k open-domain questions and answers generated using documents from Falcon's refinedweb dataset using GPT-4. You can find more details about this work in the following blogpost.\n\nEach row consits of:\n\n- document_id - an id of a text chunk from the refined web dataset, from which the question was generated. Each id contains the original document index from the refinedweb dataset, and the chunk index in the following format: \"${REFINEDWEB_ID}_${CHUNK_INDEX}\"\n- document_text - the text of the chunk from which the question was generated.\n- generated_question - the generated question.\n- generated_answer - the corresponding generated answer." ]
624d631c1eb9eb93115e6781b57ea872648b7587
# Dataset of wild_mane/ワイルドメイン/野鬃 (Arknights) This is the dataset of wild_mane/ワイルドメイン/野鬃 (Arknights), containing 13 images and their tags. The core tags of this character are `long_hair, animal_ears, grey_hair, yellow_eyes, horse_ears, bangs, breasts, feather_hair, tail`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 13 | 17.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wild_mane_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 13 | 13.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wild_mane_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 23 | 22.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wild_mane_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 13 | 16.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wild_mane_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 23 | 26.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wild_mane_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/wild_mane_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | closed_mouth, 1girl, looking_at_viewer, armored_boots, gauntlets, visor_(armor), weapon, standing, black_gloves, black_shorts, cape, high-waist_shorts, holding, multiple_girls, solo, thigh_strap, white_shirt | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | closed_mouth | 1girl | looking_at_viewer | armored_boots | gauntlets | visor_(armor) | weapon | standing | black_gloves | black_shorts | cape | high-waist_shorts | holding | multiple_girls | solo | thigh_strap | white_shirt | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:--------|:--------------------|:----------------|:------------|:----------------|:---------|:-----------|:---------------|:---------------|:-------|:--------------------|:----------|:-----------------|:-------|:--------------|:--------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
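For example, here is a hedged sketch (not from the original card) of mining the clustered outfit above by filtering the raw dataset on its tags. It reuses the `LocalSource` iteration from the loading snippet; treating `item.meta['tags']` as a container that supports membership tests is an assumption, since the card does not document its exact structure.

```python
from waifuc.source import LocalSource

# Assumes the raw archive was already extracted to 'dataset_dir',
# as in the loading snippet above.
source = LocalSource('dataset_dir')

# Tags taken from cluster 0 in the tables above; 'in' works whether tags are
# stored as a list or as a {tag: score} mapping (assumption, see lead-in).
wanted = {'1girl', 'solo', 'gauntlets'}
for item in source:
    tags = item.meta['tags']
    if all(t in tags for t in wanted):
        print(item.meta['filename'])
```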
CyberHarem/wild_mane_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T08:37:38+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T08:44:07+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of wild\_mane/ワイルドメイン/野鬃 (Arknights) ============================================ This is the dataset of wild\_mane/ワイルドメイン/野鬃 (Arknights), containing 13 images and their tags. The core tags of this character are 'long\_hair, animal\_ears, grey\_hair, yellow\_eyes, horse\_ears, bangs, breasts, feather\_hair, tail', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
dcc9adad45f32beac7e0ee185d193d55c3fa68d3
# Dataset Card for "alpaca_cleaned_subset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jan-hq/alpaca_cleaned_subset
[ "region:us" ]
2024-01-11T08:45:27+00:00
{"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1207894.5539412673, "num_examples": 1552}], "download_size": 739719, "dataset_size": 1207894.5539412673}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-11T08:45:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "alpaca_cleaned_subset" More Information needed
[ "# Dataset Card for \"alpaca_cleaned_subset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"alpaca_cleaned_subset\"\n\nMore Information needed" ]
e206a095404184787ef7a94b04f65205f141ed89
# Dataset Card for Evaluation run of rishiraj/oswald-7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [rishiraj/oswald-7b](https://huggingface.co/rishiraj/oswald-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_rishiraj__oswald-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T08:51:34.161186](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__oswald-7b/blob/main/results_2024-01-11T08-51-34.161186.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6563844878215765, "acc_stderr": 0.03172096744574799, "acc_norm": 0.6569381429523545, "acc_norm_stderr": 0.03236913498893308, "mc1": 0.3733170134638923, "mc1_stderr": 0.01693237055757063, "mc2": 0.5407006602948565, "mc2_stderr": 0.015292352537910794 }, "harness|arc:challenge|25": { "acc": 0.6356655290102389, "acc_stderr": 0.014063260279882419, "acc_norm": 0.6638225255972696, "acc_norm_stderr": 0.013804855026205761 }, "harness|hellaswag|10": { "acc": 0.6581358295160327, "acc_stderr": 0.004733649274814508, "acc_norm": 0.851822346146186, "acc_norm_stderr": 0.003545499169558053 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6222222222222222, "acc_stderr": 0.04188307537595852, "acc_norm": 0.6222222222222222, "acc_norm_stderr": 0.04188307537595852 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.02815283794249387, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.02815283794249387 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { 
"acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.036146654241808254, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.036146654241808254 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4417989417989418, "acc_stderr": 0.025576257061253837, "acc_norm": 0.4417989417989418, "acc_norm_stderr": 0.025576257061253837 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8032258064516129, "acc_stderr": 0.022616409420742025, "acc_norm": 0.8032258064516129, "acc_norm_stderr": 0.022616409420742025 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.0315841532404771, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.0315841532404771 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9222797927461139, "acc_stderr": 0.019321805557223144, "acc_norm": 0.9222797927461139, "acc_norm_stderr": 0.019321805557223144 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6871794871794872, "acc_stderr": 0.023507579020645358, "acc_norm": 0.6871794871794872, "acc_norm_stderr": 0.023507579020645358 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34814814814814815, "acc_stderr": 0.029045600290616255, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.029045600290616255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.02995382389188704, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.02995382389188704 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.038227469376587525, "acc_norm": 0.32450331125827814, 
"acc_norm_stderr": 0.038227469376587525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8587155963302753, "acc_stderr": 0.014933868987028075, "acc_norm": 0.8587155963302753, "acc_norm_stderr": 0.014933868987028075 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8185654008438819, "acc_stderr": 0.025085961144579654, "acc_norm": 0.8185654008438819, "acc_norm_stderr": 0.025085961144579654 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699813, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699813 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.035477710041594654, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.035477710041594654 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243839, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243839 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8974358974358975, "acc_stderr": 0.01987565502786744, "acc_norm": 0.8974358974358975, "acc_norm_stderr": 0.01987565502786744 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8339719029374202, "acc_stderr": 0.013306478243066302, "acc_norm": 0.8339719029374202, "acc_norm_stderr": 0.013306478243066302 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7543352601156069, "acc_stderr": 0.02317629820399201, "acc_norm": 0.7543352601156069, "acc_norm_stderr": 0.02317629820399201 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.264804469273743, "acc_stderr": 0.01475690648326066, "acc_norm": 0.264804469273743, "acc_norm_stderr": 0.01475690648326066 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7777777777777778, "acc_stderr": 0.023805186524888135, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.023805186524888135 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.02567025924218893, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.02567025924218893 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600713002, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600713002 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5, "acc_stderr": 0.029827499313594685, 
"acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4869621903520209, "acc_stderr": 0.012765893883835332, "acc_norm": 0.4869621903520209, "acc_norm_stderr": 0.012765893883835332 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7279411764705882, "acc_stderr": 0.027033041151681456, "acc_norm": 0.7279411764705882, "acc_norm_stderr": 0.027033041151681456 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6862745098039216, "acc_stderr": 0.018771683893528183, "acc_norm": 0.6862745098039216, "acc_norm_stderr": 0.018771683893528183 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7510204081632653, "acc_stderr": 0.027682979522960238, "acc_norm": 0.7510204081632653, "acc_norm_stderr": 0.027682979522960238 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.3733170134638923, "mc1_stderr": 0.01693237055757063, "mc2": 0.5407006602948565, "mc2_stderr": 0.015292352537910794 }, "harness|winogrande|5": { "acc": 0.8089976322020521, "acc_stderr": 0.011047808761510429 }, "harness|gsm8k|5": { "acc": 0.6929492039423806, "acc_stderr": 0.012705685723131707 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_rishiraj__oswald-7b
[ "region:us" ]
2024-01-11T08:53:52+00:00
{"pretty_name": "Evaluation run of rishiraj/oswald-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [rishiraj/oswald-7b](https://huggingface.co/rishiraj/oswald-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rishiraj__oswald-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T08:51:34.161186](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__oswald-7b/blob/main/results_2024-01-11T08-51-34.161186.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563844878215765,\n \"acc_stderr\": 0.03172096744574799,\n \"acc_norm\": 0.6569381429523545,\n \"acc_norm_stderr\": 0.03236913498893308,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5407006602948565,\n \"mc2_stderr\": 0.015292352537910794\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6581358295160327,\n \"acc_stderr\": 0.004733649274814508,\n \"acc_norm\": 0.851822346146186,\n \"acc_norm_stderr\": 0.003545499169558053\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n 
\"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253837,\n \"acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253837\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223144,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223144\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6871794871794872,\n 
\"acc_stderr\": 0.023507579020645358,\n \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645358\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028075,\n \"acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028075\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.01987565502786744,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.01987565502786744\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n 
\"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.01475690648326066,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.01475690648326066\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888135,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888135\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4869621903520209,\n \"acc_stderr\": 0.012765893883835332,\n \"acc_norm\": 0.4869621903520209,\n \"acc_norm_stderr\": 0.012765893883835332\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.027033041151681456,\n \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.027033041151681456\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528183,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528183\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5407006602948565,\n \"mc2_stderr\": 0.015292352537910794\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510429\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \"acc_stderr\": 0.012705685723131707\n }\n}\n```", "repo_url": "https://huggingface.co/rishiraj/oswald-7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|arc:challenge|25_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|gsm8k|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hellaswag|10_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T08-51-34.161186.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T08-51-34.161186.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T08-51-34.161186.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T08-51-34.161186.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T08-51-34.161186.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T08-51-34.161186.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["**/details_harness|winogrande|5_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T08-51-34.161186.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T08_51_34.161186", "path": ["results_2024-01-11T08-51-34.161186.parquet"]}, {"split": "latest", "path": 
["results_2024-01-11T08-51-34.161186.parquet"]}]}]}
2024-01-11T08:54:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of rishiraj/oswald-7b Dataset automatically created during the evaluation run of model rishiraj/oswald-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the snippet after this card text): ## Latest results These are the latest results from run 2024-01-11T08:51:34.161186 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
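The loading snippet referenced above (repository and configuration names as recorded verbatim in this record's metadata summary) is:

```python
from datasets import load_dataset

# Per-sample details for one evaluated task of this run; the "train" split
# always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_rishiraj__oswald-7b",
                    "harness_winogrande_5",
                    split="train")
```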
[ "# Dataset Card for Evaluation run of rishiraj/oswald-7b\n\n\n\nDataset automatically created during the evaluation run of model rishiraj/oswald-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T08:51:34.161186 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of rishiraj/oswald-7b\n\n\n\nDataset automatically created during the evaluation run of model rishiraj/oswald-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T08:51:34.161186 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
99f9117cb8d67f70a506561fdef56f6c779f1b57
# Dataset Card for Evaluation run of kodonho/Solar-M-SakuraSolar-Mixed <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [kodonho/Solar-M-SakuraSolar-Mixed](https://huggingface.co/kodonho/Solar-M-SakuraSolar-Mixed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_kodonho__Solar-M-SakuraSolar-Mixed", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T09:30:23.797442](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Solar-M-SakuraSolar-Mixed/blob/main/results_2024-01-11T09-30-23.797442.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6286669075578979, "acc_stderr": 0.03182161644520094, "acc_norm": 0.6409086669005702, "acc_norm_stderr": 0.03268280559886575, "mc1": 0.35006119951040393, "mc1_stderr": 0.01669794942015103, "mc2": 0.5961838010107571, "mc2_stderr": 0.016432758071391048 }, "harness|arc:challenge|25": { "acc": 0.44283276450511944, "acc_stderr": 0.014515573873348895, "acc_norm": 0.4590443686006826, "acc_norm_stderr": 0.01456229107360123 }, "harness|hellaswag|10": { "acc": 0.4344752041426011, "acc_stderr": 0.00494674860827134, "acc_norm": 0.5856403106950807, "acc_norm_stderr": 0.004916043838455666 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368879, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421296, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421296 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6641509433962264, "acc_stderr": 0.029067220146644833, "acc_norm": 0.6641509433962264, "acc_norm_stderr": 0.029067220146644833 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, 
"acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6358381502890174, "acc_stderr": 0.03669072477416906, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.03669072477416906 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3235294117647059, "acc_stderr": 0.04655010411319616, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.04655010411319616 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.04461960433384739, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6340425531914894, "acc_stderr": 0.031489558297455304, "acc_norm": 0.6340425531914894, "acc_norm_stderr": 0.031489558297455304 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.543859649122807, "acc_stderr": 0.04685473041907789, "acc_norm": 0.543859649122807, "acc_norm_stderr": 0.04685473041907789 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.48677248677248675, "acc_stderr": 0.025742297289575142, "acc_norm": 0.48677248677248675, "acc_norm_stderr": 0.025742297289575142 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377561, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377561 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8032258064516129, "acc_stderr": 0.022616409420742025, "acc_norm": 0.8032258064516129, "acc_norm_stderr": 0.022616409420742025 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.03158415324047711, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.03158415324047711 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8484848484848485, "acc_stderr": 0.025545650426603634, "acc_norm": 0.8484848484848485, "acc_norm_stderr": 0.025545650426603634 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.02293514405391943, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.02293514405391943 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6153846153846154, "acc_stderr": 0.02466674491518722, "acc_norm": 0.6153846153846154, "acc_norm_stderr": 0.02466674491518722 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.02803792996911499, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.02803792996911499 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.030489911417673227, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.030489911417673227 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8220183486238533, "acc_stderr": 0.016399436366612893, "acc_norm": 0.8220183486238533, "acc_norm_stderr": 0.016399436366612893 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250447, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250447 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8438818565400844, "acc_stderr": 0.023627159460318667, "acc_norm": 0.8438818565400844, "acc_norm_stderr": 0.023627159460318667 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699796, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699796 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7251908396946565, "acc_stderr": 0.03915345408847836, "acc_norm": 0.7251908396946565, "acc_norm_stderr": 0.03915345408847836 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794089, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794089 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.03675668832233188, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.03675668832233188 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597528, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597528 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.013547415658662255, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.013547415658662255 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526501, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526501 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3128491620111732, "acc_stderr": 0.015506892594647267, "acc_norm": 0.3128491620111732, "acc_norm_stderr": 0.015506892594647267 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7124183006535948, "acc_stderr": 0.02591780611714716, "acc_norm": 0.7124183006535948, "acc_norm_stderr": 0.02591780611714716 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7716049382716049, "acc_stderr": 0.023358211840626267, "acc_norm": 0.7716049382716049, "acc_norm_stderr": 
0.023358211840626267 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5212765957446809, "acc_stderr": 0.029800481645628693, "acc_norm": 0.5212765957446809, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4980443285528031, "acc_stderr": 0.012770138422208631, "acc_norm": 0.4980443285528031, "acc_norm_stderr": 0.012770138422208631 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7132352941176471, "acc_stderr": 0.027472274473233818, "acc_norm": 0.7132352941176471, "acc_norm_stderr": 0.027472274473233818 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6797385620915033, "acc_stderr": 0.01887568293806945, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.01887568293806945 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514279, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514279 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685515, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.03158149539338734, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.35006119951040393, "mc1_stderr": 0.01669794942015103, "mc2": 0.5961838010107571, "mc2_stderr": 0.016432758071391048 }, "harness|winogrande|5": { "acc": 0.7024467245461721, "acc_stderr": 0.012849085254614647 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
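For readers who want to recompute the headline numbers from the "Latest results" JSON shown above rather than read them off by hand, here is a minimal sketch. It assumes `huggingface_hub` is installed; the repo id and the `results_2024-01-11T09-30-23.797442.json` filename are taken from the links in this card, while the variable names and the fallback for the top-level layout are illustrative assumptions, not guaranteed API.

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file linked from the "Latest results" section.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kodonho__Solar-M-SakuraSolar-Mixed",
    filename="results_2024-01-11T09-30-23.797442.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Assumption: the per-task dict shown above may sit at the top level or be
# nested under a "results" key depending on the harness version; handle both.
scores = data.get("results", data)

# Macro-average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = {
    task: metrics["acc"]
    for task, metrics in scores.items()
    if task.startswith("harness|hendrycksTest-")
}
print(f"MMLU subtasks: {len(mmlu)}")
print(f"Macro-average acc: {sum(mmlu.values()) / len(mmlu):.4f}")
```

The same numbers are also available as Parquet via the `results` configuration listed in the metadata below, which `load_dataset` can read directly using the `latest` split.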
open-llm-leaderboard/details_kodonho__Solar-M-SakuraSolar-Mixed
[ "region:us" ]
2024-01-11T09:26:40+00:00
{"pretty_name": "Evaluation run of kodonho/Solar-M-SakuraSolar-Mixed", "dataset_summary": "Dataset automatically created during the evaluation run of model [kodonho/Solar-M-SakuraSolar-Mixed](https://huggingface.co/kodonho/Solar-M-SakuraSolar-Mixed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kodonho__Solar-M-SakuraSolar-Mixed\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T09:30:23.797442](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Solar-M-SakuraSolar-Mixed/blob/main/results_2024-01-11T09-30-23.797442.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6286669075578979,\n \"acc_stderr\": 0.03182161644520094,\n \"acc_norm\": 0.6409086669005702,\n \"acc_norm_stderr\": 0.03268280559886575,\n \"mc1\": 0.35006119951040393,\n \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5961838010107571,\n \"mc2_stderr\": 0.016432758071391048\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.44283276450511944,\n \"acc_stderr\": 0.014515573873348895,\n \"acc_norm\": 0.4590443686006826,\n \"acc_norm_stderr\": 0.01456229107360123\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4344752041426011,\n \"acc_stderr\": 0.00494674860827134,\n \"acc_norm\": 0.5856403106950807,\n \"acc_norm_stderr\": 0.004916043838455666\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644833,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6340425531914894,\n \"acc_stderr\": 0.031489558297455304,\n \"acc_norm\": 0.6340425531914894,\n \"acc_norm_stderr\": 0.031489558297455304\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377561,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377561\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603634,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603634\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.02293514405391943,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.02293514405391943\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518722,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612893,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318667,\n \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318667\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.013547415658662255,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662255\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526501,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526501\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n \"acc_stderr\": 0.015506892594647267,\n \"acc_norm\": 0.3128491620111732,\n \"acc_norm_stderr\": 0.015506892594647267\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7716049382716049,\n \"acc_stderr\": 0.023358211840626267,\n \"acc_norm\": 0.7716049382716049,\n \"acc_norm_stderr\": 0.023358211840626267\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5212765957446809,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.5212765957446809,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4980443285528031,\n \"acc_stderr\": 0.012770138422208631,\n \"acc_norm\": 0.4980443285528031,\n \"acc_norm_stderr\": 0.012770138422208631\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233818,\n \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233818\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.01887568293806945,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.01887568293806945\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514279,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514279\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35006119951040393,\n \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5961838010107571,\n \"mc2_stderr\": 0.016432758071391048\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7024467245461721,\n \"acc_stderr\": 0.012849085254614647\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/kodonho/Solar-M-SakuraSolar-Mixed", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|arc:challenge|25_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|arc:challenge|25_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|gsm8k|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|gsm8k|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hellaswag|10_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hellaswag|10_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-24-16.867632.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T09-24-16.867632.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-30-23.797442.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-30-23.797442.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-30-23.797442.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T09-30-23.797442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-24-16.867632.parquet"]}, 
{"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["**/details_harness|winogrande|5_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": ["**/details_harness|winogrande|5_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T09-30-23.797442.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T09_24_16.867632", "path": ["results_2024-01-11T09-24-16.867632.parquet"]}, {"split": "2024_01_11T09_30_23.797442", "path": 
["results_2024-01-11T09-30-23.797442.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T09-30-23.797442.parquet"]}]}]}
2024-01-11T09:32:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of kodonho/Solar-M-SakuraSolar-Mixed Dataset automatically created during the evaluation run of model kodonho/Solar-M-SakuraSolar-Mixed on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T09:30:23.797442 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of kodonho/Solar-M-SakuraSolar-Mixed\n\n\n\nDataset automatically created during the evaluation run of model kodonho/Solar-M-SakuraSolar-Mixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T09:30:23.797442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of kodonho/Solar-M-SakuraSolar-Mixed\n\n\n\nDataset automatically created during the evaluation run of model kodonho/Solar-M-SakuraSolar-Mixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T09:30:23.797442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
aa23a56debdfe50a6498688592521b146cdf88af
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-v1.01 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-7B-v1.01](https://huggingface.co/decruz07/kellemar-DPO-7B-v1.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-v1.01", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T09:27:06.412050](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-v1.01/blob/main/results_2024-01-11T09-27-06.412050.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6350266239511185, "acc_stderr": 0.03227049463371384, "acc_norm": 0.6365361433636346, "acc_norm_stderr": 0.032917747977285784, "mc1": 0.38310893512851896, "mc1_stderr": 0.017018461679389855, "mc2": 0.5554335409983038, "mc2_stderr": 0.015361029343436068 }, "harness|arc:challenge|25": { "acc": 0.6245733788395904, "acc_stderr": 0.014150631435111726, "acc_norm": 0.6578498293515358, "acc_norm_stderr": 0.013864152159177278 }, "harness|hellaswag|10": { "acc": 0.6614220274845648, "acc_stderr": 0.004722589460698221, "acc_norm": 0.8504282015534754, "acc_norm_stderr": 0.003559223015610495 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.04232073695151589, "acc_norm": 0.6, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.02872750295788027, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.02872750295788027 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.22, "acc_stderr": 0.041633319989322695, "acc_norm": 0.22, "acc_norm_stderr": 0.041633319989322695 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6011560693641619, "acc_stderr": 0.037336266553835096, "acc_norm": 0.6011560693641619, "acc_norm_stderr": 0.037336266553835096 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201943, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201943 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108101, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268552, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268552 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5221674876847291, "acc_stderr": 0.03514528562175007, "acc_norm": 0.5221674876847291, "acc_norm_stderr": 0.03514528562175007 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.031922715695483016, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.031922715695483016 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8756476683937824, "acc_stderr": 0.023814477086593552, "acc_norm": 0.8756476683937824, "acc_norm_stderr": 0.023814477086593552 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6128205128205129, "acc_stderr": 0.024697216930878937, "acc_norm": 0.6128205128205129, "acc_norm_stderr": 0.024697216930878937 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.02840653309060846, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.02840653309060846 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 
0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.01591955782997604, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.01591955782997604 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7941176470588235, "acc_stderr": 0.028379449451588663, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.028379449451588663 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290913, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290913 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742179, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742179 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841407, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841407 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.013547415658662266, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.013547415658662266 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3329608938547486, "acc_stderr": 0.015761716178397566, "acc_norm": 0.3329608938547486, "acc_norm_stderr": 0.015761716178397566 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.02505850331695814, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.02505850331695814 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.02600330111788514, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.02600330111788514 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.02399350170904211, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.02399350170904211 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5035460992907801, "acc_stderr": 
0.02982674915328092, "acc_norm": 0.5035460992907801, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.012745204626083131, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.012745204626083131 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.028582709753898452, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.028582709753898452 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8208955223880597, "acc_stderr": 0.027113286753111837, "acc_norm": 0.8208955223880597, "acc_norm_stderr": 0.027113286753111837 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.38310893512851896, "mc1_stderr": 0.017018461679389855, "mc2": 0.5554335409983038, "mc2_stderr": 0.015361029343436068 }, "harness|winogrande|5": { "acc": 0.7868981846882399, "acc_stderr": 0.011508957690722762 }, "harness|gsm8k|5": { "acc": 0.6163760424564063, "acc_stderr": 0.013394238584938156 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-v1.01
[ "region:us" ]
2024-01-11T09:29:23+00:00
{"pretty_name": "Evaluation run of decruz07/kellemar-DPO-7B-v1.01", "dataset_summary": "Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-7B-v1.01](https://huggingface.co/decruz07/kellemar-DPO-7B-v1.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-v1.01\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T09:27:06.412050](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-v1.01/blob/main/results_2024-01-11T09-27-06.412050.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6350266239511185,\n \"acc_stderr\": 0.03227049463371384,\n \"acc_norm\": 0.6365361433636346,\n \"acc_norm_stderr\": 0.032917747977285784,\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5554335409983038,\n \"mc2_stderr\": 0.015361029343436068\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177278\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6614220274845648,\n \"acc_stderr\": 0.004722589460698221,\n \"acc_norm\": 0.8504282015534754,\n \"acc_norm_stderr\": 0.003559223015610495\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.013547415658662266,\n 
\"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662266\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3329608938547486,\n \"acc_stderr\": 0.015761716178397566,\n \"acc_norm\": 0.3329608938547486,\n \"acc_norm_stderr\": 0.015761716178397566\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083131,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083131\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898452,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898452\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5554335409983038,\n \"mc2_stderr\": 0.015361029343436068\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722762\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6163760424564063,\n \"acc_stderr\": 0.013394238584938156\n }\n}\n```", "repo_url": 
"https://huggingface.co/decruz07/kellemar-DPO-7B-v1.01", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|arc:challenge|25_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|gsm8k|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hellaswag|10_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-27-06.412050.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-27-06.412050.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-27-06.412050.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T09-27-06.412050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-27-06.412050.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T09_27_06.412050", "path": ["**/details_harness|winogrande|5_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T09-27-06.412050.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T09_27_06.412050", "path": ["results_2024-01-11T09-27-06.412050.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T09-27-06.412050.parquet"]}]}]}
2024-01-11T09:29:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-v1.01 Dataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B-v1.01 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch after this card's text): ## Latest results These are the latest results from run 2024-01-11T09:27:06.412050 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
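For concreteness, here is a minimal loading sketch in the style used by other evaluation-run cards in this collection. The config name `harness_winogrande_5` is taken from this record's metadata, while the repository id `open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-v1.01` is inferred from the evaluated model name following the leaderboard's usual naming scheme, so treat it as an assumption rather than a verified value:

```python
from datasets import load_dataset

# Repo id inferred from the model name (assumed leaderboard naming scheme);
# the config name "harness_winogrande_5" comes from this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-v1.01",
    "harness_winogrande_5",
    split="train",
)
```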
[ "# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-v1.01\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B-v1.01 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T09:27:06.412050(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-v1.01\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B-v1.01 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T09:27:06.412050(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
3a10cb50e8e47c4279d52a6273686d377d5b121c
# Dataset of u-official/U-Official (Arknights) This is the dataset of u-official/U-Official (Arknights), containing 24 images and their tags. The core tags of this character are `pink_hair, bangs, headphones, hair_between_eyes, hair_ornament, breasts, multicolored_hair, drill_hair, purple_eyes, pink_eyes, animal_ears, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 24 | 47.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_official_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 24 | 20.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_official_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 68 | 52.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_official_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 24 | 38.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_official_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 68 | 79.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_official_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/u_official_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------| | 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, blush, shirt, black_choker, sitting, skirt, black_jacket, holding, long_sleeves, necktie, open_mouth | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | shirt | black_choker | sitting | skirt | black_jacket | holding | long_sleeves | necktie | open_mouth | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------|:---------------|:----------|:--------|:---------------|:----------|:---------------|:----------|:-------------| | 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/u_official_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T09:30:16+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T09:38:14+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of u-official/U-Official (Arknights) ============================================ This is the dataset of u-official/U-Official (Arknights), containing 24 images and their tags. The core tags of this character are 'pink\_hair, bangs, headphones, hair\_between\_eyes, hair\_ornament, breasts, multicolored\_hair, drill\_hair, purple\_eyes, pink\_eyes, animal\_ears, large\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
e5f79eab67884a2b53366be7005dadebef528e8e
# Dataset of catapult/カタパルト/空爆 (Arknights) This is the dataset of catapult/カタパルト/空爆 (Arknights), containing 10 images and their tags. The core tags of this character are `animal_ears, brown_hair, horse_ears, short_hair, green_eyes, hair_ornament, bangs, breasts, hair_between_eyes, horse_girl, red_hair, tail, hairclip, multicolored_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 10 | 8.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catapult_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 10 | 5.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catapult_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 22 | 11.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catapult_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 10 | 7.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catapult_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 22 | 15.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catapult_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/catapult_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, long_sleeves, looking_at_viewer, smile, black_thighhighs, open_mouth, black_choker, black_shirt, navel, open_jacket, belt, crop_top, midriff, simple_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | long_sleeves | looking_at_viewer | smile | black_thighhighs | open_mouth | black_choker | black_shirt | navel | open_jacket | belt | crop_top | midriff | simple_background | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:--------|:-------------------|:-------------|:---------------|:--------------|:--------|:--------------|:-------|:-----------|:----------|:--------------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/catapult_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T09:30:19+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T09:35:07+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of catapult/カタパルト/空爆 (Arknights) ======================================== This is the dataset of catapult/カタパルト/空爆 (Arknights), containing 10 images and their tags. The core tags of this character are 'animal\_ears, brown\_hair, horse\_ears, short\_hair, green\_eyes, hair\_ornament, bangs, breasts, hair\_between\_eyes, horse\_girl, red\_hair, tail, hairclip, multicolored\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
ff3a6ff94cd95aaa1a8a78f722726caa6b3e4606
# Dataset of cement/セメント/洋灰 (Arknights) This is the dataset of cement/セメント/洋灰 (Arknights), containing 19 images and their tags. The core tags of this character are `brown_hair, short_hair, brown_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 19 | 28.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cement_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 19 | 15.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cement_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 47 | 34.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cement_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 19 | 25.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cement_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 47 | 50.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cement_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/cement_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, open_mouth, helmet, long_sleeves, looking_at_viewer, orange_jacket, white_background, full_body, simple_background, :d, shorts | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | open_mouth | helmet | long_sleeves | looking_at_viewer | orange_jacket | white_background | full_body | simple_background | :d | shorts | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:---------|:---------------|:--------------------|:----------------|:-------------------|:------------|:--------------------|:-----|:---------| | 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/cement_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T09:30:23+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T09:35:02+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of cement/セメント/洋灰 (Arknights) ===================================== This is the dataset of cement/セメント/洋灰 (Arknights), containing 19 images and their tags. The core tags of this character are 'brown\_hair, short\_hair, brown\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
6a981e68b1fb1f5f2560dd0453fbcb2040a11eae
# Dataset of firewhistle/ファイヤーホイッスル/火哨 (Arknights) This is the dataset of firewhistle/ファイヤーホイッスル/火哨 (Arknights), containing 20 images and their tags. The core tags of this character are `bangs, breasts, long_hair, hair_ornament, yellow_eyes, brown_hair, hairclip, medium_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 20 | 35.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/firewhistle_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 20 | 17.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/firewhistle_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 56 | 40.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/firewhistle_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 20 | 29.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/firewhistle_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 56 | 61.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/firewhistle_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/firewhistle_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, bare_shoulders, off_shoulder, open_jacket, midriff, long_sleeves, smile, black_pants, navel, black_jacket, black_shorts, crop_top, necklace, simple_background, strapless, open_mouth, shirt, shoes | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | bare_shoulders | off_shoulder | open_jacket | midriff | long_sleeves | smile | black_pants | navel | black_jacket | black_shorts | crop_top | necklace | simple_background | strapless | open_mouth | shirt | shoes | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------|:---------------|:--------------|:----------|:---------------|:--------|:--------------|:--------|:---------------|:---------------|:-----------|:-----------|:--------------------|:------------|:-------------|:--------|:--------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/firewhistle_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T09:42:21+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T09:46:49+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of firewhistle/ファイヤーホイッスル/火哨 (Arknights) ================================================ This is the dataset of firewhistle/ファイヤーホイッスル/火哨 (Arknights), containing 20 images and their tags. The core tags of this character are 'bangs, breasts, long\_hair, hair\_ornament, yellow\_eyes, brown\_hair, hairclip, medium\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
10fd34ce2c56a13efb9161eec5133100e13c8086
# Dataset of dagda/ダグザ/达格达 (Arknights) This is the dataset of dagda/ダグザ/达格达 (Arknights), containing 40 images and their tags. The core tags of this character are `animal_ears, black_hair, long_hair, yellow_eyes, cat_ears, hair_between_eyes, bangs, tail, very_long_hair, cat_tail, cat_girl, ear_piercing`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 40 | 62.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagda_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 40 | 30.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagda_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 93 | 63.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagda_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 40 | 53.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagda_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 93 | 99.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagda_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/dagda_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | long_sleeves, 1girl, black_jacket, black_shirt, looking_at_viewer, solo, thigh_strap, black_shorts, open_jacket, short_shorts, closed_mouth, simple_background, belt, black_choker, piercing, black_footwear, black_gloves, extra_ears, boots, cowboy_shot, white_background, standing | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_choker, black_jacket, black_shirt, looking_at_viewer, upper_body, open_jacket, simple_background, solo, white_background, fang, piercing, closed_mouth, collarbone, cropped_torso, extra_ears, jewelry, open_mouth, slit_pupils | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | long_sleeves | 1girl | black_jacket | black_shirt | looking_at_viewer | solo | thigh_strap | black_shorts | open_jacket | short_shorts | closed_mouth | simple_background | belt | black_choker | piercing | black_footwear | black_gloves | extra_ears | boots | cowboy_shot | white_background | standing | upper_body | fang | collarbone | cropped_torso | jewelry | open_mouth | slit_pupils | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:--------|:---------------|:--------------|:--------------------|:-------|:--------------|:---------------|:--------------|:---------------|:---------------|:--------------------|:-------|:---------------|:-----------|:-----------------|:---------------|:-------------|:--------|:--------------|:-------------------|:-----------|:-------------|:-------|:-------------|:----------------|:----------|:-------------|:--------------| | 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | | X | X | X | X | X | | | X | | X | X | | X | X | | | X | | | X | | X | X | X | X | X | X | X |
CyberHarem/dagda_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T09:42:32+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T09:51:37+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of dagda/ダグザ/达格达 (Arknights) ==================================== This is the dataset of dagda/ダグザ/达格达 (Arknights), containing 40 images and their tags. The core tags of this character are 'animal\_ears, black\_hair, long\_hair, yellow\_eyes, cat\_ears, hair\_between\_eyes, bangs, tail, very\_long\_hair, cat\_tail, cat\_girl, ear\_piercing', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
59218e5013d524f1bcfc952b9bb65808367cc515
# PhishingURLDataset This dataset was created for training neural networks on phishing website detection. This card has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details This dataset contains phishing websites, which are labeled "1" and referred to as "malicious", and benign websites, which are labeled "0". ### Dataset Sources - **Kaggle Dataset on Phishing URLs:** https://www.kaggle.com/datasets/siddharthkumar25/malicious-and-benign-urls - **USOM Phishing Websites Dataset:** https://www.usom.gov.tr/url-list.txt - **Phishtank Dataset:** http://data.phishtank.com/data/online-valid.csv
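As a quick orientation, here is a minimal loading sketch using the standard `datasets` API; the repository id, split names, and features are taken from this record's id and metadata:

```python
from datasets import load_dataset

# The "train" and "test" splits are declared in the card metadata.
ds = load_dataset("semihGuner2002/PhishingURLsDataset")

# Each example pairs a URL string with a binary class label
# (1 = phishing, 0 = benign).
example = ds["train"][0]
print(example["url"], example["label"])
```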
semihGuner2002/PhishingURLsDataset
[ "license:apache-2.0", "region:us" ]
2024-01-11T09:48:27+00:00
{"license": "apache-2.0", "dataset_info": {"features": [{"name": "url", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "0.0", "1": "1.0"}}}}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 38536329.817049906, "num_examples": 642533}, {"name": "test", "num_bytes": 6800578.1829500925, "num_examples": 113389}], "download_size": 32729166, "dataset_size": 45336908.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2024-01-11T10:14:04+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
# PhishingURLDataset This dataset was created for training neural networks on phishing website detection. This card has been generated using this raw template. ## Dataset Details This dataset contains phishing websites, which are labeled "1" and referred to as "malicious", and benign websites, which are labeled "0". ### Dataset Sources - Kaggle Dataset on Phishing URLs: URL - USOM Phishing Websites Dataset: URL - Phishtank Dataset: URL
[ "# PhishingURLDataset\n\nThis dataset is created for being used for neural network training, on phishing website detection.\n\nIt has been generated using this raw template.", "## Dataset Details\n\nThis dataset contains phishing websites, which are labeled with \"1\" and are called \"malignant\", and benign websites, which are labeled with \"0\".", "### Dataset Sources \n- Kaggle Dataset on Phishing URLs: URL\n- USOM Phishing Websites Dataset: URL\n- Phishtank Dataset: URL" ]
[ "TAGS\n#license-apache-2.0 #region-us \n", "# PhishingURLDataset\n\nThis dataset is created for being used for neural network training, on phishing website detection.\n\nIt has been generated using this raw template.", "## Dataset Details\n\nThis dataset contains phishing websites, which are labeled with \"1\" and are called \"malignant\", and benign websites, which are labeled with \"0\".", "### Dataset Sources \n- Kaggle Dataset on Phishing URLs: URL\n- USOM Phishing Websites Dataset: URL\n- Phishtank Dataset: URL" ]
1b781af257596859c9a659e2cb1f3d9206e90521
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.1](https://huggingface.co/andysalerno/openchat-nectar-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T09:55:59.577915](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.1/blob/main/results_2024-01-11T09-55-59.577915.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.654396718006494, "acc_stderr": 0.03184182155109172, "acc_norm": 0.6549387923759522, "acc_norm_stderr": 0.03249770872652723, "mc1": 0.3769889840881273, "mc1_stderr": 0.01696551757893035, "mc2": 0.5421624590053248, "mc2_stderr": 0.015360430241150334 }, "harness|arc:challenge|25": { "acc": 0.6254266211604096, "acc_stderr": 0.014144193471893449, "acc_norm": 0.6621160409556314, "acc_norm_stderr": 0.01382204792228351 }, "harness|hellaswag|10": { "acc": 0.6329416450906195, "acc_stderr": 0.004810175357870936, "acc_norm": 0.8299143596893049, "acc_norm_stderr": 0.003749401775087307 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119668, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119668 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.028049186315695255, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.028049186315695255 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.0358687928008034, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.0358687928008034 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr":
0.04943110704237102 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456344, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456344 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6936416184971098, "acc_stderr": 0.03514942551267438, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.03514942551267438 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101735, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.047028804320496165, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.047028804320496165 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4444444444444444, "acc_stderr": 0.025591857761382182, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.025591857761382182 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8096774193548387, "acc_stderr": 0.022331707611823078, "acc_norm": 0.8096774193548387, "acc_norm_stderr": 0.022331707611823078 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361008, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.031922715695483016, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.031922715695483016 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563973, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563973 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.029443169323031537, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.029443169323031537 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 
0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8513761467889909, "acc_stderr": 0.015251253773660834, "acc_norm": 0.8513761467889909, "acc_norm_stderr": 0.015251253773660834 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.0340763209385405, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.0340763209385405 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.02615686752393104, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.02615686752393104 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944863, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944863 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699813, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699813 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.03768335959728745, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.03768335959728745 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.03520893951097653, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.03520893951097653 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243839, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243839 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8974358974358975, "acc_stderr": 0.01987565502786744, "acc_norm": 0.8974358974358975, "acc_norm_stderr": 0.01987565502786744 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8326947637292464, "acc_stderr": 0.013347327202920332, "acc_norm": 0.8326947637292464, "acc_norm_stderr": 0.013347327202920332 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7543352601156069, "acc_stderr": 0.023176298203992, "acc_norm": 0.7543352601156069, "acc_norm_stderr": 0.023176298203992 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2636871508379888, "acc_stderr": 0.014736926383761976, "acc_norm": 0.2636871508379888, "acc_norm_stderr": 0.014736926383761976 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292456, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188936, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188936 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7777777777777778, "acc_stderr": 0.023132376234543332, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.023132376234543332 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.46808510638297873, "acc_stderr": 0.029766675075873866, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873866 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.48565840938722293, "acc_stderr": 0.012764981829524269, "acc_norm": 0.48565840938722293, "acc_norm_stderr": 0.012764981829524269 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7352941176470589, "acc_stderr": 0.02679956202488766, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.02679956202488766 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.01890101532209309, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.01890101532209309 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302505, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302505 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7551020408163265, "acc_stderr": 0.027529637440174937, "acc_norm": 0.7551020408163265, "acc_norm_stderr": 0.027529637440174937 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.02619392354445412, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.02619392354445412 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197768, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197768 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640038, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640038 }, "harness|truthfulqa:mc|0": { "mc1": 0.3769889840881273, "mc1_stderr": 0.01696551757893035, "mc2": 0.5421624590053248, "mc2_stderr": 0.015360430241150334 }, "harness|winogrande|5": { "acc": 0.813733228097869, "acc_stderr": 0.01094187795567621 }, "harness|gsm8k|5": { "acc": 0.6967399545109931, "acc_stderr": 0.012661502663418697 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
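As a complement to the loading snippet in this card, the aggregated "results" configuration mentioned above can be loaded the same way. A minimal sketch, assuming the standard Open LLM Leaderboard layout in which the `latest` split mirrors the most recent run:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# the "latest" split always points at the most recent timestamped run.
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.1",
    "results",
    split="latest",
)
print(results[0])
```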
open-llm-leaderboard/details_andysalerno__openchat-nectar-0.1
[ "region:us" ]
2024-01-11T09:58:17+00:00
{"pretty_name": "Evaluation run of andysalerno/openchat-nectar-0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.1](https://huggingface.co/andysalerno/openchat-nectar-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T09:55:59.577915](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.1/blob/main/results_2024-01-11T09-55-59.577915.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.654396718006494,\n \"acc_stderr\": 0.03184182155109172,\n \"acc_norm\": 0.6549387923759522,\n \"acc_norm_stderr\": 0.03249770872652723,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5421624590053248,\n \"mc2_stderr\": 0.015360430241150334\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893449,\n \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.01382204792228351\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6329416450906195,\n \"acc_stderr\": 0.004810175357870936,\n \"acc_norm\": 0.8299143596893049,\n \"acc_norm_stderr\": 0.003749401775087307\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456344,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456344\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944863,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944863\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.01987565502786744,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.01987565502786744\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n 
\"acc_norm\": 0.8326947637292464,\n \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.014736926383761976,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.014736926383761976\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023132376234543332,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023132376234543332\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48565840938722293,\n \"acc_stderr\": 0.012764981829524269,\n \"acc_norm\": 0.48565840938722293,\n \"acc_norm_stderr\": 0.012764981829524269\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02679956202488766,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02679956202488766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174937,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174937\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5421624590053248,\n \"mc2_stderr\": 0.015360430241150334\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.01094187795567621\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \"acc_stderr\": 0.012661502663418697\n }\n}\n```", "repo_url": 
"https://huggingface.co/andysalerno/openchat-nectar-0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|arc:challenge|25_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|gsm8k|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hellaswag|10_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-55-59.577915.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-55-59.577915.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-55-59.577915.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T09-55-59.577915.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-55-59.577915.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T09_55_59.577915", "path": ["**/details_harness|winogrande|5_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T09-55-59.577915.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T09_55_59.577915", "path": ["results_2024-01-11T09-55-59.577915.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T09-55-59.577915.parquet"]}]}]}
2024-01-11T09:58:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.1 Dataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T09:55:59.577915 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
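The loading snippet referenced above under "To load the details from a run" was stripped when this card text was processed. A minimal sketch following the standard pattern of the other leaderboard cards in this dump — note the details repo id `open-llm-leaderboard/details_andysalerno__openchat-nectar-0.1` is inferred from the leaderboard's `details_<org>__<model>` naming convention, and `harness_winogrande_5` is just one of the 63 config names listed in this record's metadata:

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard naming convention; any of the
# 63 config names listed in the metadata can be requested instead.
data = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.1",
    "harness_winogrande_5",
    split="train",
)
```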
[ "# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.1\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T09:55:59.577915(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.1\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T09:55:59.577915(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
d7e4823df49732e26762458ec98f6adbeb8b92d1
# EvalCrafter Text-to-Video (ECTV) Dataset 🎥📊 [Code](https://github.com/EvalCrafter/EvalCrafter) · [Project Page](http://evalcrafter.github.io) · [Huggingface Leaderboard](https://huggingface.co/spaces/AILab-CVC/EvalCrafter) · [Paper@ArXiv](https://arxiv.org/abs/2310.11440) · [Prompt list](https://github.com/evalcrafter/EvalCrafter/blob/master/prompt700.txt) Welcome to the ECTV dataset! This repository contains around 10000 videos generated by various methods using the [Prompt list](https://github.com/evalcrafter/EvalCrafter/blob/master/prompt700.txt). These videos have been evaluated using the innovative EvalCrafter framework, which assesses generative models across visual, content, and motion qualities using 17 objective metrics and subjective user opinions. ## Dataset Details 📚 - **Paper:** [Read the Paper](https://arxiv.org/abs/2310.11440) - **Code:** [Code](https://github.com/EvalCrafter/EvalCrafter) - **Prompt List (700 prompts):** [Prompt list](https://github.com/evalcrafter/EvalCrafter/blob/master/prompt700.txt) - **Hugging Face Leaderboard:** [Huggingface Leaderboard](https://huggingface.co/spaces/AILab-CVC/EvalCrafter) - **Project Page:** [Project Page](http://evalcrafter.github.io) - **Methods Included in ECTV dataset:** - [VideoCrafter2](https://github.com/AILab-CVC/VideoCrafter) - [VideoCrafter1](https://github.com/AILab-CVC/VideoCrafter) - [VideoCrafter0.9 (Floor33)](http://floor33.tech/) - [Gen2-2023.12](https://research.runwayml.com/gen2) - [Gen2-2023.09](https://research.runwayml.com/gen2) - [PikaLab V1.0](https://pika.art/) - [PikaLab](https://www.pika.art/) - [Hotshot-XL](https://research.runwayml.com/gen2) - [Show-1](https://research.runwayml.com/gen2) - [Modelscope-XL](https://modelscope.cn/models/damo/Image-to-Video/summary) - [Zeroscope](https://huggingface.co/cerspense) - [Lavie](https://github.com/Vchitect/LaVie) - [MoonValley](https://moonvalley.ai/) - **Dataset Structure:** Generated videos are organized in the following structure (taking videocrafter-v1.0 as an example) ``` ./videocrafter-v1.0.tar.gz/videocrafter-v1.0/ ├── 0000.mp4 ├── 0001.mp4 ├── 0002.mp4 ├── 0003.mp4 ├── 0004.mp4 ... └── 0699.mp4 ``` ## Acknowledgements and Citation 🙏 This dataset is based on the EvalCrafter framework, which utilizes various open-source repositories for video generation evaluation. If you find this dataset helpful, please consider citing the original work: ```bibtex @article{liu2023evalcrafter, title={Evalcrafter: Benchmarking and evaluating large video generation models}, author={Liu, Yaofang and Cun, Xiaodong and Liu, Xuebo and Wang, Xintao and Zhang, Yong and Chen, Haoxin and Liu, Yang and Zeng, Tieyong and Chan, Raymond and Shan, Ying}, journal={arXiv preprint arXiv:2310.11440}, year={2023} } ``` ## Explore More About Video Generation: - [VideoCrafter1: Open Diffusion Models for High-Quality Video Generation](https://github.com/AILab-CVC/VideoCrafter) - [VideoCrafter2: Overcoming Data Limitations for High-Quality Video Diffusion Models](https://github.com/AILab-CVC/VideoCrafter)
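Each model's videos ship as a single `<model>.tar.gz` archive with the flat per-prompt layout shown above. A minimal sketch of one way to fetch and unpack the archives locally — it assumes the archives sit at the repository root as implied by the layout above, that `huggingface_hub` is installed, and the output directory name is illustrative:

```python
import tarfile
from pathlib import Path

from huggingface_hub import snapshot_download

# Download only the per-model archives from the dataset repo.
local_dir = snapshot_download(
    repo_id="RaphaelLiu/EvalCrafter_T2V_Dataset",
    repo_type="dataset",
    allow_patterns="*.tar.gz",
)

# Unpack each archive, e.g. videocrafter-v1.0.tar.gz ->
# videocrafter-v1.0/0000.mp4 ... videocrafter-v1.0/0699.mp4.
out_dir = Path("ectv_videos")
out_dir.mkdir(exist_ok=True)
for archive in sorted(Path(local_dir).glob("*.tar.gz")):
    with tarfile.open(archive) as tar:
        tar.extractall(out_dir)
```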
RaphaelLiu/EvalCrafter_T2V_Dataset
[ "license:apache-2.0", "arxiv:2310.11440", "region:us" ]
2024-01-11T10:08:25+00:00
{"license": "apache-2.0"}
2024-01-24T13:34:33+00:00
[ "2310.11440" ]
[]
TAGS #license-apache-2.0 #arxiv-2310.11440 #region-us
# EvalCrafter Text-to-Video (ECTV) Dataset Code · Project Page · Huggingface Leaderboard · Paper@ArXiv · Prompt list Welcome to the ECTV dataset! This repository contains around 10000 videos generated by various methods using the Prompt list. These videos have been evaluated using the innovative EvalCrafter framework, which assesses generative models across visual, content, and motion qualities using 17 objective metrics and subjective user opinions. ## Dataset Details - Paper: Read the Paper - Code: Code - Prompt List (700 prompts): Prompt list - Hugging Face Leaderboard: Huggingface Leaderboard - Project Page: Project Page - Methods Included in ECTV dataset: - VideoCrafter2 - VideoCrafter1 - VideoCrafter0.9 (Floor33) - Gen2-2023.12 - Gen2-2023.09 - PikaLab V1.0 - PikaLab - Hotshot-XL - Show-1 - Modelscope-XL - Zeroscope - Lavie - MoonValley - Dataset Structure: Generated videos are organized in the following structure (taking videocrafter-v1.0 as an example) ## Acknowledgements and Citation This dataset is based on the EvalCrafter framework, which utilizes various open-source repositories for video generation evaluation. If you find this dataset helpful, please consider citing the original work: ## Explore More About Video Generation: - VideoCrafter1: Open Diffusion Models for High-Quality Video Generation - VideoCrafter2: Overcoming Data Limitations for High-Quality Video Diffusion Models
[ "# EvalCrafter Text-to-Video (ECTV) Dataset \n\nCode · Project Page · Huggingface Leaderboard · Paper@ArXiv · Prompt list\n\nWelcome to the ECTV dataset! This repository contains around 10000 videos generated by various methods using the Prompt list. These videos have been evaluated using the innovative EvalCrafter framework, which assesses generative models across visual, content, and motion qualities using 17 objective metrics and subjective user opinions.", "## Dataset Details \n\n- Paper: Read the Paper\n- Code: Code \n- Prompt List (700 prompts): Prompt list\n- Hugging Face Leaderboard: Huggingface Leaderboard\n- Project Page: Project Page\n- Methods Included in ECTV dataset:\n - VideoCrafter2\n - VideoCrafter1\n - VideoCrafter0.9 (Floor33)\n - Gen2-2023.12\n - Gen2-2023.09\n - PikaLab V1.0\n - PikaLab\n - Hotshot-XL\n - Show-1\n - Modelscope-XL\n - Zeroscope\n - Lavie\n - MoonValley\n- Dataset Structure:\n Generate videos are organized in the following structure (take videocrafter-v1.0 for an example)", "## Acknowledgements and Citation \n\nThis dataset is based on the EvalCrafter framework, which utilizes various open-source repositories for video generation evaluation. If you find this dataset helpful, please consider citing the original work:", "## Explore More About Video Generation:\n\n- VideoCrafter1: Open Diffusion Models for High-Quality Video Generation\n- VideoCrafter2: Overcoming Data Limitations for High-Quality Video Diffusion Models" ]
[ "TAGS\n#license-apache-2.0 #arxiv-2310.11440 #region-us \n", "# EvalCrafter Text-to-Video (ECTV) Dataset \n\nCode · Project Page · Huggingface Leaderboard · Paper@ArXiv · Prompt list\n\nWelcome to the ECTV dataset! This repository contains around 10000 videos generated by various methods using the Prompt list. These videos have been evaluated using the innovative EvalCrafter framework, which assesses generative models across visual, content, and motion qualities using 17 objective metrics and subjective user opinions.", "## Dataset Details \n\n- Paper: Read the Paper\n- Code: Code \n- Prompt List (700 prompts): Prompt list\n- Hugging Face Leaderboard: Huggingface Leaderboard\n- Project Page: Project Page\n- Methods Included in ECTV dataset:\n - VideoCrafter2\n - VideoCrafter1\n - VideoCrafter0.9 (Floor33)\n - Gen2-2023.12\n - Gen2-2023.09\n - PikaLab V1.0\n - PikaLab\n - Hotshot-XL\n - Show-1\n - Modelscope-XL\n - Zeroscope\n - Lavie\n - MoonValley\n- Dataset Structure:\n Generate videos are organized in the following structure (take videocrafter-v1.0 for an example)", "## Acknowledgements and Citation \n\nThis dataset is based on the EvalCrafter framework, which utilizes various open-source repositories for video generation evaluation. If you find this dataset helpful, please consider citing the original work:", "## Explore More About Video Generation:\n\n- VideoCrafter1: Open Diffusion Models for High-Quality Video Generation\n- VideoCrafter2: Overcoming Data Limitations for High-Quality Video Diffusion Models" ]
111bec4c71468e26cb138d161d972f9f19aefef0
# Dataset Card for Evaluation run of rishiraj/oswald-2x7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [rishiraj/oswald-2x7b](https://huggingface.co/rishiraj/oswald-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_rishiraj__oswald-2x7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T10:06:31.070515](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__oswald-2x7b/blob/main/results_2024-01-11T10-06-31.070515.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6533236478227353, "acc_stderr": 0.03188761134034235, "acc_norm": 0.6556283671019292, "acc_norm_stderr": 0.032521516691180946, "mc1": 0.4357405140758874, "mc1_stderr": 0.017358345398863124, "mc2": 0.6006314442487943, "mc2_stderr": 0.015414089468190334 }, "harness|arc:challenge|25": { "acc": 0.6279863481228669, "acc_stderr": 0.01412459788184446, "acc_norm": 0.6646757679180887, "acc_norm_stderr": 0.013796182947785562 }, "harness|hellaswag|10": { "acc": 0.6697868950408286, "acc_stderr": 0.004693285694663836, "acc_norm": 0.8546106353316073, "acc_norm_stderr": 0.003517725787017748 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7236842105263158, "acc_stderr": 0.03639057569952928, "acc_norm": 0.7236842105263158, "acc_norm_stderr": 0.03639057569952928 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544067, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544067 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.035868792800803406, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.035868792800803406 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.032469569197899575, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.032469569197899575 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.025559920550531, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.025559920550531 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083522, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083522 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.029620227874790482, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.029620227874790482 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328972, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328972 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969115, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969115 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7058823529411765, "acc_stderr": 0.029597329730978086, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.029597329730978086 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 
0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8385321100917431, "acc_stderr": 0.015776239256163255, "acc_norm": 0.8385321100917431, "acc_norm_stderr": 0.015776239256163255 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455334, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455334 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.03063659134869981, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.03063659134869981 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.03520893951097654, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.03520893951097654 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.031570650789119005, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.031570650789119005 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5267857142857143, "acc_stderr": 0.047389751192741546, "acc_norm": 0.5267857142857143, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.021901905115073325, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.021901905115073325 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8326947637292464, "acc_stderr": 0.013347327202920332, "acc_norm": 0.8326947637292464, "acc_norm_stderr": 0.013347327202920332 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3005586592178771, "acc_stderr": 0.01533456680625116, "acc_norm": 0.3005586592178771, "acc_norm_stderr": 0.01533456680625116 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.02505850331695814, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.02505850331695814 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600713, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600713 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47783572359843546, "acc_stderr": 0.012757683047716177, "acc_norm": 0.47783572359843546, "acc_norm_stderr": 0.012757683047716177 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389845, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389845 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6797385620915033, "acc_stderr": 0.018875682938069443, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.018875682938069443 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.027979823538744543, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.027979823538744543 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.4357405140758874, "mc1_stderr": 0.017358345398863124, "mc2": 0.6006314442487943, "mc2_stderr": 0.015414089468190334 }, "harness|winogrande|5": { "acc": 0.7940015785319653, "acc_stderr": 0.011366474352008826 }, "harness|gsm8k|5": { "acc": 0.5981804397270659, "acc_stderr": 0.013504357787494042 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
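Since every per-task entry in the "Latest results" JSON above shares the same `acc`/`acc_stderr` shape, here is a small illustrative sketch (not part of the original card) of how one might average the MMLU ("hendrycksTest") sub-task accuracies. It assumes the linked results file has been downloaded locally and that its parsed contents match the dict printed above; the local filename is illustrative:

```python
import json

# Results file linked under "Latest results" above; local path is illustrative.
with open("results_2024-01-11T10-06-31.070515.json") as f:
    results = json.load(f)

# Keep only the MMLU sub-tasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU sub-tasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```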
open-llm-leaderboard/details_rishiraj__oswald-2x7b
[ "region:us" ]
2024-01-11T10:08:46+00:00
{"pretty_name": "Evaluation run of rishiraj/oswald-2x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [rishiraj/oswald-2x7b](https://huggingface.co/rishiraj/oswald-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rishiraj__oswald-2x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T10:06:31.070515](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__oswald-2x7b/blob/main/results_2024-01-11T10-06-31.070515.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533236478227353,\n \"acc_stderr\": 0.03188761134034235,\n \"acc_norm\": 0.6556283671019292,\n \"acc_norm_stderr\": 0.032521516691180946,\n \"mc1\": 0.4357405140758874,\n \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.6006314442487943,\n \"mc2_stderr\": 0.015414089468190334\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.01412459788184446,\n \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.013796182947785562\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6697868950408286,\n \"acc_stderr\": 0.004693285694663836,\n \"acc_norm\": 0.8546106353316073,\n \"acc_norm_stderr\": 0.003517725787017748\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969115,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978086,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978086\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163255,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163255\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.03063659134869981,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.03063659134869981\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 
0.8326947637292464,\n \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3005586592178771,\n \"acc_stderr\": 0.01533456680625116,\n \"acc_norm\": 0.3005586592178771,\n \"acc_norm_stderr\": 0.01533456680625116\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47783572359843546,\n \"acc_stderr\": 0.012757683047716177,\n \"acc_norm\": 0.47783572359843546,\n \"acc_norm_stderr\": 0.012757683047716177\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744543,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744543\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4357405140758874,\n \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.6006314442487943,\n \"mc2_stderr\": 0.015414089468190334\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7940015785319653,\n \"acc_stderr\": 0.011366474352008826\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5981804397270659,\n \"acc_stderr\": 0.013504357787494042\n }\n}\n```", "repo_url": "https://huggingface.co/rishiraj/oswald-2x7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-06-31.070515.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-06-31.070515.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-06-31.070515.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-06-31.070515.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-06-31.070515.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-06-31.070515.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["**/details_harness|winogrande|5_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T10-06-31.070515.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T10_06_31.070515", "path": ["results_2024-01-11T10-06-31.070515.parquet"]}, {"split": "latest", "path": 
["results_2024-01-11T10-06-31.070515.parquet"]}]}]}
2024-01-11T10:09:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of rishiraj/oswald-2x7b Dataset automatically created during the evaluation run of model rishiraj/oswald-2x7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2024-01-11T10:06:31.070515 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
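In this flattened rendering, the code fence that followed "do the following" is dropped. Below is a minimal sketch of what that snippet plausibly looks like, mirroring the loading example carried by the neighbouring record in this dump; note that the repo id `open-llm-leaderboard/details_rishiraj__oswald-2x7b` is an assumption inferred from the `details_{org}__{model}` naming convention, and the config and split names are taken from the metadata listed above:

```python
from datasets import load_dataset

# Assumed repo id: inferred from the details_{org}__{model} naming pattern
# used by the other evaluation records in this dump.
# "harness_winogrande_5" and the "latest" split appear in this record's
# metadata config list above.
data = load_dataset(
    "open-llm-leaderboard/details_rishiraj__oswald-2x7b",
    "harness_winogrande_5",
    split="latest",
)
print(data[0])  # first per-example record of the latest winogrande eval
```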
[ "# Dataset Card for Evaluation run of rishiraj/oswald-2x7b\n\n\n\nDataset automatically created during the evaluation run of model rishiraj/oswald-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:06:31.070515(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of rishiraj/oswald-2x7b\n\n\n\nDataset automatically created during the evaluation run of model rishiraj/oswald-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:06:31.070515(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
d64b2d8a2161ee1b70fb835744d8cd6b7b323160
# Dataset Card for Evaluation run of flemmingmiguel/Mistrality-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [flemmingmiguel/Mistrality-7B](https://huggingface.co/flemmingmiguel/Mistrality-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following (see also the usage sketch after this card): ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T10:13:19.328780](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B/blob/main/results_2024-01-11T10-13-19.328780.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6491508355371155, "acc_stderr": 0.031941087236083424, "acc_norm": 0.6501709046743914, "acc_norm_stderr": 0.03258488545513918, "mc1": 0.3929008567931457, "mc1_stderr": 0.017097248285233065, "mc2": 0.567957871171655, "mc2_stderr": 0.01545729191398638 }, "harness|arc:challenge|25": { "acc": 0.6279863481228669, "acc_stderr": 0.014124597881844461, "acc_norm": 0.6655290102389079, "acc_norm_stderr": 0.013787460322441374 }, "harness|hellaswag|10": { "acc": 0.6734714200358495, "acc_stderr": 0.004679847503411342, "acc_norm": 0.858195578570006, "acc_norm_stderr": 0.00348136484077097 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7302631578947368, "acc_stderr": 0.03611780560284898, "acc_norm": 0.7302631578947368, "acc_norm_stderr": 0.03611780560284898 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.034765901043041336, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.034765901043041336 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.630057803468208, "acc_stderr": 0.036812296333943194, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.036812296333943194 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878151, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268545, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268545 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.020986854593289733, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.020986854593289733 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633508, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633508 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.02897264888484427, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.02897264888484427 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6974789915966386, "acc_stderr": 0.029837962388291932, "acc_norm": 0.6974789915966386, "acc_norm_stderr": 0.029837962388291932 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 
0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624734, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624734 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.02280138253459753, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.02280138253459753 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.013468201614066307, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.013468201614066307 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069367, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069367 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3452513966480447, "acc_stderr": 0.015901432608930354, "acc_norm": 0.3452513966480447, "acc_norm_stderr": 0.015901432608930354 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.761437908496732, "acc_stderr": 0.02440439492808787, "acc_norm": 0.761437908496732, "acc_norm_stderr": 0.02440439492808787 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4661016949152542, "acc_stderr": 0.01274085387294983, "acc_norm": 0.4661016949152542, "acc_norm_stderr": 0.01274085387294983 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7242647058823529, "acc_stderr": 0.027146271936625166, "acc_norm": 0.7242647058823529, "acc_norm_stderr": 0.027146271936625166 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6748366013071896, "acc_stderr": 0.018950886770806304, "acc_norm": 0.6748366013071896, "acc_norm_stderr": 0.018950886770806304 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.04494290866252091, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.04494290866252091 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142773, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142773 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.027539122889061463, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.027539122889061463 }, "harness|truthfulqa:mc|0": { "mc1": 0.3929008567931457, "mc1_stderr": 0.017097248285233065, "mc2": 0.567957871171655, "mc2_stderr": 0.01545729191398638 }, "harness|winogrande|5": { "acc": 0.7932123125493291, "acc_stderr": 0.011382566829235798 }, "harness|gsm8k|5": { "acc": 0.6671721000758151, "acc_stderr": 0.01297989249659828 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
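As a follow-up to the loading example in the card above, here is a short usage sketch; it assumes the "results" configuration and the "latest" split behave as the card describes (the aggregated record should mirror the JSON shown under "Latest results"):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B"

# Aggregated metrics for the run ("results" config, "latest" split, as
# described in the card; treat the split name as an assumption).
results = load_dataset(REPO, "results", split="latest")

# Per-example details for one task, e.g. winogrande (5-shot).
details = load_dataset(REPO, "harness_winogrande_5", split="latest")

print(results[0])  # run-level metrics: acc, acc_norm, mc1/mc2, ...
print(details[0])  # first winogrande example with the model's outputs
```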
open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B
[ "region:us" ]
2024-01-11T10:15:39+00:00
{"pretty_name": "Evaluation run of flemmingmiguel/Mistrality-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [flemmingmiguel/Mistrality-7B](https://huggingface.co/flemmingmiguel/Mistrality-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T10:13:19.328780](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B/blob/main/results_2024-01-11T10-13-19.328780.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6491508355371155,\n \"acc_stderr\": 0.031941087236083424,\n \"acc_norm\": 0.6501709046743914,\n \"acc_norm_stderr\": 0.03258488545513918,\n \"mc1\": 0.3929008567931457,\n \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.567957871171655,\n \"mc2_stderr\": 0.01545729191398638\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441374\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6734714200358495,\n \"acc_stderr\": 0.004679847503411342,\n \"acc_norm\": 0.858195578570006,\n \"acc_norm_stderr\": 0.00348136484077097\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8288633461047255,\n \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n \"acc_stderr\": 0.015901432608930354,\n \"acc_norm\": 0.3452513966480447,\n \"acc_norm_stderr\": 0.015901432608930354\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625166,\n \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625166\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806304,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806304\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061463,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061463\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3929008567931457,\n \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.567957871171655,\n \"mc2_stderr\": 0.01545729191398638\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235798\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6671721000758151,\n \"acc_stderr\": 0.01297989249659828\n 
}\n}\n```", "repo_url": "https://huggingface.co/flemmingmiguel/Mistrality-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-13-19.328780.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-13-19.328780.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-13-19.328780.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-13-19.328780.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-13-19.328780.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T10_13_19.328780", "path": ["**/details_harness|winogrande|5_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T10-13-19.328780.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T10_13_19.328780", "path": ["results_2024-01-11T10-13-19.328780.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T10-13-19.328780.parquet"]}]}]}
2024-01-11T10:15:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of flemmingmiguel/Mistrality-7B Dataset automatically created during the evaluation run of model flemmingmiguel/Mistrality-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T10:13:19.328780 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
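For instance, a minimal loading sketch for the "load the details from a run" step mentioned above (the details-repo id below follows the usual Open LLM Leaderboard naming pattern for this model and is an assumption; the config and split names are taken from the configs listed above):

```python
from datasets import load_dataset

# Assumed details-repo id (standard leaderboard naming for this model).
# Any config_name from the list above works; "latest" points to the newest run.
data = load_dataset(
    "open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B",
    "harness_winogrande_5",
    split="latest",
)
```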
[ "# Dataset Card for Evaluation run of flemmingmiguel/Mistrality-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/Mistrality-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:13:19.328780(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of flemmingmiguel/Mistrality-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/Mistrality-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:13:19.328780(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
df55257e423442363b41088102dbddb0abd84bb4
<img src="https://huggingface.co/datasets/nyanko7/danbooru2023/resolve/main/cover.webp" alt="cover" width="750"/> # Danbooru2023: A Large-Scale Crowdsourced and Tagged Anime Illustration Dataset <!-- Provide a quick summary of the dataset. --> Danbooru2023 is a large-scale anime image dataset with over 5 million images contributed and annotated in detail by an enthusiast community. Image tags cover aspects like characters, scenes, copyrights, artists, etc with an average of 30 tags per image. Danbooru is a veteran anime image board with high-quality images and extensive tag metadata. The dataset can be used to train image classification, multi-label tagging, character detection, generative models, and other computer vision tasks. - **Shared by:** Nyanko Devs - **Language(s):** English, Japanese - **License:** MIT This dataset is built on the top of [danbooru2021](https://gwern.net/danbooru2021). We expands the dataset to include images up to ID #6,857,737, adding over 1.8 million additional images and total size is now approximately 8 terabytes (8,000 GB). ## Use ## Format The goal of the dataset is to be as easy as possible to use immediately, avoiding obscure file formats, while allowing simultaneous research & seeding of the torrent, with easy updates. Images are provided in the full original form (be that JPG, PNG, GIF or otherwise) for reference/archival purposes, and bucketed into 1000 subdirectories 0000–0999 (0-padded), which is the Danbooru ID modulo 1000 (ie. all images in 0999/ have an ID ending in ‘999’); IDs can be turned into paths by dividing & padding (eg. in Bash, BUCKET=$(printf "%04d" $(( ID % 1000 )) )) and then the file is at {original,512px}/$BUCKET/$ID.$EXT. The reason for the bucketing is that a single directory would cause pathological filesystem performance, and modulo ID is a simple hash which spreads images evenly without requiring additional future directories to be made or a filesystem IO to check where the file is. The ID is not zero-padded and files end in the relevant extension, hence the file layout looks like this: ```bash $ tree / | less / ├── danbooru2023 -> /mnt/diffusionstorage/workspace/danbooru/ │ ├── metadata │ ├── readme.md │ ├── original │ │ ├── 0000 -> data-0000.tar │ │ ├── 0001 -> data-0001.tar │ │ │ ├── 10001.jpg │ │ │ ├── 210001.png │ │ │ ├── 3120001.webp │ │ │ ├── 6513001.jpg ``` Currently represented file extensions are: avi/bmp/gif/html/jpeg/jpg/mp3/mp4/mpg/pdf/png/rar/swf/webm/wmv/zip. Raw original files are treacherous. Be careful if working with the original dataset. There are many odd files: truncated, non-sRGB colorspace, wrong file extensions (eg. some PNGs have .jpg extensions like original/0146/1525146.jpg or original/0558/1422558.jpg), etc.
jpft/danbooru2023
[ "task_categories:image-classification", "task_categories:image-to-image", "task_categories:text-to-image", "size_categories:1M<n<10M", "language:en", "language:ja", "license:mit", "region:us" ]
2024-01-11T10:28:25+00:00
{"language": ["en", "ja"], "license": "mit", "size_categories": ["1M<n<10M"], "task_categories": ["image-classification", "image-to-image", "text-to-image"], "pretty_name": "danbooru2023", "viewer": false}
2024-01-11T10:28:25+00:00
[]
[ "en", "ja" ]
TAGS #task_categories-image-classification #task_categories-image-to-image #task_categories-text-to-image #size_categories-1M<n<10M #language-English #language-Japanese #license-mit #region-us
<img src="URL alt="cover" width="750"/> # Danbooru2023: A Large-Scale Crowdsourced and Tagged Anime Illustration Dataset Danbooru2023 is a large-scale anime image dataset with over 5 million images contributed and annotated in detail by an enthusiast community. Image tags cover aspects like characters, scenes, copyrights, artists, etc with an average of 30 tags per image. Danbooru is a veteran anime image board with high-quality images and extensive tag metadata. The dataset can be used to train image classification, multi-label tagging, character detection, generative models, and other computer vision tasks. - Shared by: Nyanko Devs - Language(s): English, Japanese - License: MIT This dataset is built on the top of danbooru2021. We expands the dataset to include images up to ID #6,857,737, adding over 1.8 million additional images and total size is now approximately 8 terabytes (8,000 GB). ## Use ## Format The goal of the dataset is to be as easy as possible to use immediately, avoiding obscure file formats, while allowing simultaneous research & seeding of the torrent, with easy updates. Images are provided in the full original form (be that JPG, PNG, GIF or otherwise) for reference/archival purposes, and bucketed into 1000 subdirectories 0000–0999 (0-padded), which is the Danbooru ID modulo 1000 (ie. all images in 0999/ have an ID ending in ‘999’); IDs can be turned into paths by dividing & padding (eg. in Bash, BUCKET=$(printf "%04d" $(( ID % 1000 )) )) and then the file is at {original,512px}/$BUCKET/$ID.$EXT. The reason for the bucketing is that a single directory would cause pathological filesystem performance, and modulo ID is a simple hash which spreads images evenly without requiring additional future directories to be made or a filesystem IO to check where the file is. The ID is not zero-padded and files end in the relevant extension, hence the file layout looks like this: Currently represented file extensions are: avi/bmp/gif/html/jpeg/jpg/mp3/mp4/mpg/pdf/png/rar/swf/webm/wmv/zip. Raw original files are treacherous. Be careful if working with the original dataset. There are many odd files: truncated, non-sRGB colorspace, wrong file extensions (eg. some PNGs have .jpg extensions like original/0146/URL or original/0558/URL), etc.
[ "# Danbooru2023: A Large-Scale Crowdsourced and Tagged Anime Illustration Dataset\n\n\n\nDanbooru2023 is a large-scale anime image dataset with over 5 million images contributed and annotated in detail by an enthusiast community. Image tags cover aspects like characters, scenes, copyrights, artists, etc with an average of 30 tags per image.\n\nDanbooru is a veteran anime image board with high-quality images and extensive tag metadata. The dataset can be used to train image classification, multi-label tagging, character detection, generative models, and other computer vision tasks.\n\n- Shared by: Nyanko Devs\n- Language(s): English, Japanese\n- License: MIT\n\nThis dataset is built on the top of danbooru2021. We expands the dataset to include images up to ID #6,857,737, adding over 1.8 million additional images and total size is now approximately 8 terabytes (8,000 GB).", "## Use", "## Format\n\nThe goal of the dataset is to be as easy as possible to use immediately, avoiding obscure file formats, while allowing simultaneous research & seeding of the torrent, with easy updates.\n\nImages are provided in the full original form (be that JPG, PNG, GIF or otherwise) for reference/archival purposes, and bucketed into 1000 subdirectories 0000–0999 (0-padded), which is the Danbooru ID modulo 1000 (ie. all images in 0999/ have an ID ending in ‘999’); IDs can be turned into paths by dividing & padding (eg. in Bash, BUCKET=$(printf \"%04d\" $(( ID % 1000 )) )) and then the file is at {original,512px}/$BUCKET/$ID.$EXT. \n\nThe reason for the bucketing is that a single directory would cause pathological filesystem performance, and modulo ID is a simple hash which spreads images evenly without requiring additional future directories to be made or a filesystem IO to check where the file is. The ID is not zero-padded and files end in the relevant extension, hence the file layout looks like this:\n\n\n \nCurrently represented file extensions are: avi/bmp/gif/html/jpeg/jpg/mp3/mp4/mpg/pdf/png/rar/swf/webm/wmv/zip. \n\nRaw original files are treacherous. Be careful if working with the original dataset. There are many odd files: truncated, non-sRGB colorspace, wrong file extensions (eg. some PNGs have .jpg extensions like original/0146/URL or original/0558/URL), etc." ]
[ "TAGS\n#task_categories-image-classification #task_categories-image-to-image #task_categories-text-to-image #size_categories-1M<n<10M #language-English #language-Japanese #license-mit #region-us \n", "# Danbooru2023: A Large-Scale Crowdsourced and Tagged Anime Illustration Dataset\n\n\n\nDanbooru2023 is a large-scale anime image dataset with over 5 million images contributed and annotated in detail by an enthusiast community. Image tags cover aspects like characters, scenes, copyrights, artists, etc with an average of 30 tags per image.\n\nDanbooru is a veteran anime image board with high-quality images and extensive tag metadata. The dataset can be used to train image classification, multi-label tagging, character detection, generative models, and other computer vision tasks.\n\n- Shared by: Nyanko Devs\n- Language(s): English, Japanese\n- License: MIT\n\nThis dataset is built on the top of danbooru2021. We expands the dataset to include images up to ID #6,857,737, adding over 1.8 million additional images and total size is now approximately 8 terabytes (8,000 GB).", "## Use", "## Format\n\nThe goal of the dataset is to be as easy as possible to use immediately, avoiding obscure file formats, while allowing simultaneous research & seeding of the torrent, with easy updates.\n\nImages are provided in the full original form (be that JPG, PNG, GIF or otherwise) for reference/archival purposes, and bucketed into 1000 subdirectories 0000–0999 (0-padded), which is the Danbooru ID modulo 1000 (ie. all images in 0999/ have an ID ending in ‘999’); IDs can be turned into paths by dividing & padding (eg. in Bash, BUCKET=$(printf \"%04d\" $(( ID % 1000 )) )) and then the file is at {original,512px}/$BUCKET/$ID.$EXT. \n\nThe reason for the bucketing is that a single directory would cause pathological filesystem performance, and modulo ID is a simple hash which spreads images evenly without requiring additional future directories to be made or a filesystem IO to check where the file is. The ID is not zero-padded and files end in the relevant extension, hence the file layout looks like this:\n\n\n \nCurrently represented file extensions are: avi/bmp/gif/html/jpeg/jpg/mp3/mp4/mpg/pdf/png/rar/swf/webm/wmv/zip. \n\nRaw original files are treacherous. Be careful if working with the original dataset. There are many odd files: truncated, non-sRGB colorspace, wrong file extensions (eg. some PNGs have .jpg extensions like original/0146/URL or original/0558/URL), etc." ]
4e43991035ac62e44ebbb0ba6bab7d26229efca7
# The ICL consistency test

This 🤗 dataset provides data for the [GenBench CBT task 'The ICL consistency test'](https://github.com/GenBench/genbench_cbt/tree/main/src/genbench/tasks/icl_consistency_test).
The ICL consistency test measures the consistency of LLM predictions on the same data points across many different equivalent prompting setups.
The score in the associated metric (Cohen's kappa) can be understood as a measure of a model's prediction consistency in the face of task-irrelevant information.

For an easy evaluation of any 🤗 model, we refer to the code provided in the GenBench task. For in-depth information on the task, we refer to the associated publications ([Weber et al., 2023](https://arxiv.org/abs/2312.04945), [2023](https://aclanthology.org/2023.conll-1.20/)) and the respective GenBench [doc.md](https://github.com/GenBench/genbench_cbt/blob/main/src/genbench/tasks/icl_consistency_test/doc.md).

Evaluation on the relevant metrics can be done via the _example_evaluation.py_ script in the [GenBench repository](https://github.com/GenBench/genbench_cbt/blob/main/src/genbench/tasks/icl_consistency_test/).

### Dataset Description

_Abstract_: The ICL consistency test measures the consistency of LLM predictions on the same data points across many different prompting setups. Different setups are defined by "factors". On the one hand, factors can be specific attributes of the used prompt (e.g. the number of examples the model is presented with ["n_shots"] or the type of instructions that were used to wrap a specific datapoint ["Instructions"]). On the other hand, the analysis can also be augmented by factors that are related to the way a model is evaluated (e.g. whether a model is calibrated) or the type of model that is evaluated (e.g. the number of parameters or instructions tuning). These external factors can be added to the analysis by using the task.add_factor() method. The output metric is Cohen's kappa for each factor across all different conditions. A kappa value close to 1 indicates that the factor does not change the model prediction, while a kappa value close to 0 indicates that the factor strongly changes model predictions. The ICL consistency test has two subtasks, one evaluating the ANLI-dataset ([Nie et al., 2019](https://aclanthology.org/2020.acl-main.441/)); the other the MNLI-dataset ([Williams et al., 2018](https://aclanthology.org/N18-1101/)).

_Size_: Each subtask contains 57600 examples when using the full 600 data_IDs. The user can choose to reduce the number of evaluated data_IDs.

- **Curated by:**
  - resampling and arrangement were done by [Weber et al., 2023](https://arxiv.org/abs/2312.04945), [2023](https://aclanthology.org/2023.conll-1.20/);
  - original data were curated by [Nie et al., 2019](https://aclanthology.org/2020.acl-main.441/) (ANLI) and [Williams et al., 2018](https://aclanthology.org/N18-1101/) (MNLI);
  - templates were curated by [Bach et al., 2022](https://aclanthology.org/2022.acl-demo.9/) (promptsource).
- **Language:** English

### Dataset Sources (basic links)

- **Repository:** Data files on [github](https://github.com/LucWeber/icl_consistency_data).
- **Paper:** [Weber et al., 2023](https://arxiv.org/abs/2312.04945), [2023](https://aclanthology.org/2023.conll-1.20/).
- **Demo:** Find pre-implemented code to evaluate any 🤗 model on [github](https://github.com/GenBench/genbench_cbt/blob/main/src/genbench/tasks/icl_consistency_test/example_evaluation.py).

## Uses

In prompting, models are sensitive to task-irrelevant information in their prompt. This test can be used to quantify this sensitivity of any 🤗 model.
The ICL consistency test does this by measuring a model's prediction consistency across many different semantically equivalent prompting setups.

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[_TBA_]

## Dataset Creation

The data is a sample from the [MNLI](https://aclanthology.org/N18-1101/) and [ANLI](https://aclanthology.org/2020.acl-main.441/) datasets as well as prompt templates from [promptsource](https://aclanthology.org/2022.acl-demo.9/). Please refer to the original publications' documentation for detailed information on dataset creation.

## Bias, Risks, and Limitations

This dataset contains data from the [MNLI](https://aclanthology.org/N18-1101/) and [ANLI](https://aclanthology.org/2020.acl-main.441/) datasets and is subject to the same biases, risks and limitations.

### Recommendations

We identify the following limitations of the consistency test:

1. The number of factors is limited and does not cover all possible factors that might influence the predictions. We limited ourselves to factors we deem relevant, to ensure fast evaluation.

2. Currently, the test is only implemented for the ANLI- and MNLI-datasets.

3. Factors that are external to the dataset but should be considered in the analysis (e.g. _instruction tuning_ or _calibration_) have to be manually added by the user using the task.add_factor() method (please use the GenBench implementation of the dataset. You can find it on [github](https://github.com/GenBench/genbench_cbt/tree/main/src/genbench/tasks/icl_consistency_test)).

## Citation

This dataset was used in the following publications. If you use it, please consider citing the following references:

**BibTeX:**

```
@inproceedings{weber2023mind,
  title={Mind the instructions: a holistic evaluation of consistency and interactions in prompt-based learning},
  author={Weber, Lucas and Bruni, Elia and Hupkes, Dieuwke},
  booktitle={Proceedings of the 27th Conference on Computational Natural Language Learning (CoNLL)},
  pages={294--313},
  year={2023}
}
```

```
@article{weber2023icl,
  title={The ICL Consistency Test},
  author={Weber, Lucas and Bruni, Elia and Hupkes, Dieuwke},
  journal={arXiv preprint arXiv:2312.04945},
  year={2023}
}
```

## Dataset Card Authors

[Lucas Weber](https://lucweber.github.io/)

## Dataset Card Contact

[email protected]
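To make the kappa metric described above concrete, here is a hedged sketch of a per-factor computation (the column names `data_ID` and `prediction` and the 0/1 coding of factor levels are assumptions for illustration, not the dataset's actual schema; the GenBench task linked above computes this for you):

```python
import pandas as pd
from sklearn.metrics import cohen_kappa_score

# Assumed layout: one row per (data point, prompting setup), with one 0/1
# column per factor and the model's label in "prediction".
def kappa_per_factor(df: pd.DataFrame, factors: list[str]) -> dict[str, float]:
    kappas = {}
    for factor in factors:
        other = [f for f in factors if f != factor]
        # Pair setups that differ only in `factor`: one column of predictions
        # for factor = 0 and one for factor = 1, per (data_ID, other-factors) cell.
        paired = df.pivot_table(index=["data_ID", *other], columns=factor,
                                values="prediction", aggfunc="first").dropna()
        kappas[factor] = cohen_kappa_score(paired[0], paired[1])
    return kappas
```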
LucasWeber/icl_consistency_test
[ "task_categories:text-classification", "size_categories:100K<n<1M", "language:en", "arxiv:2312.04945", "region:us" ]
2024-01-11T10:36:20+00:00
{"language": ["en"], "size_categories": ["100K<n<1M"], "task_categories": ["text-classification"], "pretty_name": "The ICL consistency test"}
2024-01-11T13:27:47+00:00
[ "2312.04945" ]
[ "en" ]
TAGS #task_categories-text-classification #size_categories-100K<n<1M #language-English #arxiv-2312.04945 #region-us
# The ICL consistency test This dataset provides data for the GenBench CBT task 'The ICL consistency test'. The ICL consistency test measures the consistency of LLM predictions on the same data points across many different equivalent prompting setups. The score in the associated metric (Cohen's kappa) can be understood as a measure of a model's prediction consistency in the face of task-irrelevant information. For an easy evaluation of any model, we refer to the code provided in the GenBench task. For in-depth information on the task, we refer to the associated publications (Weber et al., 2023,2023) and the respective GenBench URL. Evaluation on the relevant metrics can be done via the _example_evaluation.py_ script in the GenBench repository. ### Dataset Description _Abstract_: The ICL consistency test measures the consistency of LLM predictions on the same data points across many different prompting setups. Different setups are defined by "factors". On the one hand, factors can be specific attributes of the used prompt (e.g. the number of examples the model is presented with ["n_shots"] or the type of instructions that were used to wrap a specific datapoint ["Instructions"]). On the other hand, the analysis can also be augmented by factors that are related to the way a model is evaluated (e.g. whether a model is calibrated) or the type of model that is evaluated (e.g. the number of parameters or instructions tuning). These external factors can be added to the analysis by using the task.add_factor() method. The output metric is Cohen's kappa for each factor across all different conditions. A kappa value close to 1 indicates that the factor does not change the model prediction, while a kappa value close to 0 indicates that the factor strongly changes model predictions. The ICL consistency test has two subtasks, one evaluating the ANLI-dataset (Nie et al., 2019); the other the MNLI-dataset (Williams et al., 2018). _Size_: Each subtask contains 57600 examples when using the full 600 data_IDs. The user can choose to reduce the number of evaluated data_IDs. - Curated by: - resampling and arrangement were done by Weber et al., 2023,2023; - original data were curated by Nie et al., 2019 (ANLI) and Williams et al., 2018 (MNLI); - templates were curated by Bach et al., 2022 (promptsource). - Language: English ### Dataset Sources (basic links) - Repository: Data files on github. - Paper: Weber et al., 2023,2023. - Demo: Find pre-implemented code to evaluate any model on github. ## Uses In prompting, models are sensitive to task-irrelevant information in their prompt. This test can be used to quantify this sensitivity of any model. The ICL consistency test does this by measuring a model's prediction consistency across many different semantically equivalent prompting setups. ## Dataset Structure [_TBA_] ## Dataset Creation The data is a sample from the MNLI and ANLI datasets as well as prompt templates from promptsource. Please refer to the original publications' documentation for detailed information on dataset creation. ## Bias, Risks, and Limitations This dataset contains data from the MNLI and ANLI datasets and is subject to the same biases, risks and limitations. ### Recommendations We identify the following limitations of the consistency test: 1. The number of factors is limited and does not cover all possible factors that might influence the predictions. We limited ourselves to factors we deem relevant, to ensure fast evaluation. 2. Currently, the test is only implemented for the ANLI- and MNLI-datasets. 3. 
Factors that are external to the dataset but should be considered in the analysis (e.g. _instruction tuning_ or _calibration_) have to be manually added by the user using the task.add_factor() method (please use the GenBench implementation of the dataset. You can find it on github). This dataset was used in the following publications. If you use it, please consider citing the following references: BibTeX: ## Dataset Card Authors Lucas Weber ## Dataset Card Contact lucasweber000@URL
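## Usage sketch

The reported metric is Cohen's kappa, i.e. the chance-corrected agreement kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement between predictions across the conditions of a factor and p_e is the agreement expected by chance. The snippet below sketches how an evaluation could look through the GenBench interface. Only the task.add_factor() method is named in this card; the task identifier, the load_task entry point and all call signatures are assumptions based on GenBench CBT conventions, so treat this as an illustrative sketch and see _example_evaluation.py_ in the GenBench repository for the authoritative version.

```python
# A minimal sketch, assuming the GenBench CBT interface; only task.add_factor()
# is documented in this card, the remaining names are illustrative assumptions.
from genbench import load_task

# Hypothetical identifier for the ANLI subtask of the consistency test.
task = load_task("icl_consistency_test:anli")

# Toy predictions: one predicted label per (setup_ID, data_ID) pair.
# The exact shape of this object is an assumption.
predictions = {
    "setup_0": {"0": "entailment", "1": "neutral"},
    "setup_1": {"0": "contradiction", "1": "neutral"},
}

# Add a factor that is external to the prompts (e.g. whether the evaluated
# model was instruction-tuned) so that it enters the kappa analysis as well.
task.add_factor(factor=("instruction_tuned", True))

# One Cohen's kappa per factor: values near 1 mean the factor barely changes
# the model's predictions; values near 0 mean it changes them strongly.
results = task.evaluate_predictions(predictions=predictions)
print(results)
```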
[ "# The ICL consistency test\n\nThis dataset provides data for the GenBench CBT task 'The ICL consistency test'.\nThe ICL consistency test measures the consistency of LLM predictions on the same data points across many different equivalent prompting setups. \nThe score in the associated metric (Cohen's kappa) can be understood as a measure of a model's prediction consistency in the face of task-irrelevant information.\n\nFor an easy evaluation of any models, we refer to the code provided in the GenBench task. For in-depth information on the task, we refer to the associated \npublications (Weber et al., 2023,2023) and the respective GenBench URL.\n\nEvaluation on the relevant metrics can be done via the _example_evaluation.py_ script in the GenBench repository.", "### Dataset Description\n\n_Abstract_: The ICL consistency test measures the consistency of LLM predictions on the same data points across many different prompting setups. Different setups are defined by \"factors\". \nOn the one hand, factors can be specific attributes of the used prompt (e.g. the number of examples the model is presented with [\"n_shots\"] or the type of instructions \nthat were used to wrap a specific datapoint [\"Instructions\"]). On the other hand, the analysis can also be augmented by factors that are related to the way a model is \nevaluated (e.g. whether a model is calibrated) or the type of model that is evaluated (e.g. the number of parameters or instructions tuning). These external factors can \nbe added to the analysis by using the task.add_factor() method. The output metric is Cohen's kappa for each factor across all different conditions. A kappa value close to \n1 indicates that the factors do not change the model prediction, while a factor close to 0 strongly changes model predictions. The ICL consistency test has two subtasks, \none evaluating the ANLI-dataset (Nie et al., 2019); the other the MNLI-dataset (Wang et al., 2017).\n\n_Size_: Each subtask contains 57600 when using the full 600 data_IDs. The user can choose to reduce the number of evaluated data_IDs.\n\n- Curated by:\n - resampling and arrangement was done by Weber et al., 2023,2023;\n - original data were curated by Nie et al., 2019 (ANLI) and Wang et al., 2017 (MNLI);\n - templates were curated by Bach et al., 2022 (promptsource). \n- Language: English", "### Dataset Sources (basic links)\n\n- Repository: Data files on github.\n- Paper: Weber et al., 2023,2023.\n- Demo: Find pre-implemented code to evaluate any model on github.", "## Uses\n\nIn prompting, models are sensitive to task-irrelevant information in their prompt. This test can be used to quantify this sensitivity of any model. The ICL consistency test does this by measuring a model's prediction consistency across many different semantically equivalent prompting setups.", "## Dataset Structure\n\n\n\n[_TBA_]", "## Dataset Creation\n\nThe data is a sample from the MNLI and ANLI datasets as well as prompt templates from promptsource. \nPlease refer to the original publications's documentation for detailed information on dataset creation.", "## Bias, Risks, and Limitations\n\nThis dataset contains data from the MNLI and ANLI datasets and adheres to the same biases, risks and limitations.", "### Recommendations\n\nWe identify the following limitations of the consistency test:\n\n1. The number of factors is limited and does not cover all possible factors that might influence the predictions. We limited ourselves to factors we deem relevant, to ensure fast evaluation.\n\n2. 
Currently, the test is only implemented for the ANLI- and MNLI-datasets.\n\n3. Factors that are external to the dataset but should be considered in the analysis (e.g. _instruction tuning_ or _calibration_) have to be manually added by the user\n using the task.add_factor() method (please use the GenBench implementation of the dataset. You can find it on github).\n\n\nThis dataset was used in the following publications. If you use it, please consider citing the following references:\n\nBibTeX:", "## Dataset Card Authors\nLucas Weber", "## Dataset Card Contact\nlucasweber000@URL" ]
[ "TAGS\n#task_categories-text-classification #size_categories-100K<n<1M #language-English #arxiv-2312.04945 #region-us \n", "# The ICL consistency test\n\nThis dataset provides data for the GenBench CBT task 'The ICL consistency test'.\nThe ICL consistency test measures the consistency of LLM predictions on the same data points across many different equivalent prompting setups. \nThe score in the associated metric (Cohen's kappa) can be understood as a measure of a model's prediction consistency in the face of task-irrelevant information.\n\nFor an easy evaluation of any models, we refer to the code provided in the GenBench task. For in-depth information on the task, we refer to the associated \npublications (Weber et al., 2023,2023) and the respective GenBench URL.\n\nEvaluation on the relevant metrics can be done via the _example_evaluation.py_ script in the GenBench repository.", "### Dataset Description\n\n_Abstract_: The ICL consistency test measures the consistency of LLM predictions on the same data points across many different prompting setups. Different setups are defined by \"factors\". \nOn the one hand, factors can be specific attributes of the used prompt (e.g. the number of examples the model is presented with [\"n_shots\"] or the type of instructions \nthat were used to wrap a specific datapoint [\"Instructions\"]). On the other hand, the analysis can also be augmented by factors that are related to the way a model is \nevaluated (e.g. whether a model is calibrated) or the type of model that is evaluated (e.g. the number of parameters or instructions tuning). These external factors can \nbe added to the analysis by using the task.add_factor() method. The output metric is Cohen's kappa for each factor across all different conditions. A kappa value close to \n1 indicates that the factors do not change the model prediction, while a factor close to 0 strongly changes model predictions. The ICL consistency test has two subtasks, \none evaluating the ANLI-dataset (Nie et al., 2019); the other the MNLI-dataset (Wang et al., 2017).\n\n_Size_: Each subtask contains 57600 when using the full 600 data_IDs. The user can choose to reduce the number of evaluated data_IDs.\n\n- Curated by:\n - resampling and arrangement was done by Weber et al., 2023,2023;\n - original data were curated by Nie et al., 2019 (ANLI) and Wang et al., 2017 (MNLI);\n - templates were curated by Bach et al., 2022 (promptsource). \n- Language: English", "### Dataset Sources (basic links)\n\n- Repository: Data files on github.\n- Paper: Weber et al., 2023,2023.\n- Demo: Find pre-implemented code to evaluate any model on github.", "## Uses\n\nIn prompting, models are sensitive to task-irrelevant information in their prompt. This test can be used to quantify this sensitivity of any model. The ICL consistency test does this by measuring a model's prediction consistency across many different semantically equivalent prompting setups.", "## Dataset Structure\n\n\n\n[_TBA_]", "## Dataset Creation\n\nThe data is a sample from the MNLI and ANLI datasets as well as prompt templates from promptsource. \nPlease refer to the original publications's documentation for detailed information on dataset creation.", "## Bias, Risks, and Limitations\n\nThis dataset contains data from the MNLI and ANLI datasets and adheres to the same biases, risks and limitations.", "### Recommendations\n\nWe identify the following limitations of the consistency test:\n\n1. 
The number of factors is limited and does not cover all possible factors that might influence the predictions. We limited ourselves to factors we deem relevant, to ensure fast evaluation.\n\n2. Currently, the test is only implemented for the ANLI- and MNLI-datasets.\n\n3. Factors that are external to the dataset but should be considered in the analysis (e.g. _instruction tuning_ or _calibration_) have to be manually added by the user\n using the task.add_factor() method (please use the GenBench implementation of the dataset. You can find it on github).\n\n\nThis dataset was used in the following publications. If you use it, please consider citing the following references:\n\nBibTeX:", "## Dataset Card Authors\nLucas Weber", "## Dataset Card Contact\nlucasweber000@URL" ]
f83d83ba3b9e087b915aec1b93df2398524c0e26
# Dataset Card for Evaluation run of TriadParty/deepmoney-34b-200k-base <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [TriadParty/deepmoney-34b-200k-base](https://huggingface.co/TriadParty/deepmoney-34b-200k-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TriadParty__deepmoney-34b-200k-base", "harness_winogrande_5", split="train") ``` (A small worked example on these results is given at the end of this card.) ## Latest results These are the [latest results from run 2024-01-11T10:43:46.493782](https://huggingface.co/datasets/open-llm-leaderboard/details_TriadParty__deepmoney-34b-200k-base/blob/main/results_2024-01-11T10-43-46.493782.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7256546951493829, "acc_stderr": 0.028872391946942647, "acc_norm": 0.7403326146455504, "acc_norm_stderr": 0.029642883506415262, "mc1": 0.3072215422276622, "mc1_stderr": 0.01615020132132301, "mc2": 0.4593002272815368, "mc2_stderr": 0.014606974103928553 }, "harness|arc:challenge|25": { "acc": 0.60580204778157, "acc_stderr": 0.014280522667467327, "acc_norm": 0.6399317406143344, "acc_norm_stderr": 0.014027516814585188 }, "harness|hellaswag|10": { "acc": 0.6435968930491934, "acc_stderr": 0.00477957440277138, "acc_norm": 0.8386775542720574, "acc_norm_stderr": 0.0036707636737929607 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6962962962962963, "acc_stderr": 0.039725528847851375, "acc_norm": 0.6962962962962963, "acc_norm_stderr": 0.039725528847851375 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8355263157894737, "acc_stderr": 0.030167533468632726, "acc_norm": 0.8355263157894737, "acc_norm_stderr": 0.030167533468632726 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.78, "acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8188679245283019, "acc_stderr": 0.023702963526757798, "acc_norm": 0.8188679245283019, "acc_norm_stderr": 0.023702963526757798 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8611111111111112, "acc_stderr": 0.028919802956134912, "acc_norm": 0.8611111111111112, "acc_norm_stderr": 0.028919802956134912 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm":
0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7167630057803468, "acc_stderr": 0.034355680560478746, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.034355680560478746 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.46078431372549017, "acc_stderr": 0.049598599663841815, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.049598599663841815 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7617021276595745, "acc_stderr": 0.027851252973889774, "acc_norm": 0.7617021276595745, "acc_norm_stderr": 0.027851252973889774 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7379310344827587, "acc_stderr": 0.036646663372252565, "acc_norm": 0.7379310344827587, "acc_norm_stderr": 0.036646663372252565 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6005291005291006, "acc_stderr": 0.02522545028406793, "acc_norm": 0.6005291005291006, "acc_norm_stderr": 0.02522545028406793 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.57, "acc_stderr": 0.04975698519562427, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562427 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8935483870967742, "acc_stderr": 0.01754510295165663, "acc_norm": 0.8935483870967742, "acc_norm_stderr": 0.01754510295165663 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6403940886699507, "acc_stderr": 0.03376458246509568, "acc_norm": 0.6403940886699507, "acc_norm_stderr": 0.03376458246509568 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8424242424242424, "acc_stderr": 0.028450388805284357, "acc_norm": 0.8424242424242424, "acc_norm_stderr": 0.028450388805284357 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9141414141414141, "acc_stderr": 0.01996022556317289, "acc_norm": 0.9141414141414141, "acc_norm_stderr": 0.01996022556317289 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9792746113989638, "acc_stderr": 0.010281417011909029, "acc_norm": 0.9792746113989638, "acc_norm_stderr": 0.010281417011909029 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7923076923076923, "acc_stderr": 0.020567539567246797, "acc_norm": 0.7923076923076923, "acc_norm_stderr": 0.020567539567246797 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.42962962962962964, "acc_stderr": 0.030182099804387266, "acc_norm": 0.42962962962962964, "acc_norm_stderr": 0.030182099804387266 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8319327731092437, "acc_stderr": 0.024289102115692265, "acc_norm": 0.8319327731092437, "acc_norm_stderr": 0.024289102115692265 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.4768211920529801, "acc_stderr": 0.04078093859163083, "acc_norm": 0.4768211920529801, "acc_norm_stderr": 0.04078093859163083 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9100917431192661, "acc_stderr": 0.012264304540230435, "acc_norm": 0.9100917431192661, "acc_norm_stderr": 0.012264304540230435 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6296296296296297, "acc_stderr": 0.03293377139415191, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.03293377139415191 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8872549019607843, "acc_stderr": 0.02219857103945679, "acc_norm": 0.8872549019607843, "acc_norm_stderr": 0.02219857103945679 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8987341772151899, "acc_stderr": 0.019637720526065498, "acc_norm": 0.8987341772151899, "acc_norm_stderr": 0.019637720526065498 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7937219730941704, "acc_stderr": 0.02715715047956382, "acc_norm": 0.7937219730941704, "acc_norm_stderr": 0.02715715047956382 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8854961832061069, "acc_stderr": 0.027927473753597453, "acc_norm": 0.8854961832061069, "acc_norm_stderr": 0.027927473753597453 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8677685950413223, "acc_stderr": 0.0309227883204458, "acc_norm": 0.8677685950413223, "acc_norm_stderr": 0.0309227883204458 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8703703703703703, "acc_stderr": 0.032472243899179486, "acc_norm": 0.8703703703703703, "acc_norm_stderr": 0.032472243899179486 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8466257668711656, "acc_stderr": 0.028311601441438603, "acc_norm": 0.8466257668711656, "acc_norm_stderr": 0.028311601441438603 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8640776699029126, "acc_stderr": 0.033932957297610096, "acc_norm": 0.8640776699029126, "acc_norm_stderr": 0.033932957297610096 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9230769230769231, "acc_stderr": 0.017456987872436193, "acc_norm": 0.9230769230769231, "acc_norm_stderr": 0.017456987872436193 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8888888888888888, "acc_stderr": 0.011238260831648321, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.011238260831648321 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8179190751445087, "acc_stderr": 0.02077676110251299, "acc_norm": 0.8179190751445087, "acc_norm_stderr": 0.02077676110251299 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5027932960893855, "acc_stderr": 0.016722240595491714, "acc_norm": 0.5027932960893855, "acc_norm_stderr": 0.016722240595491714 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8366013071895425, "acc_stderr": 0.02117062301121352, "acc_norm": 0.8366013071895425, "acc_norm_stderr": 0.02117062301121352 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8327974276527331, "acc_stderr": 0.021193872528034965, "acc_norm": 0.8327974276527331, "acc_norm_stderr": 0.021193872528034965 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8518518518518519, "acc_stderr": 0.01976645956359726, "acc_norm": 
0.8518518518518519, "acc_norm_stderr": 0.01976645956359726 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.599290780141844, "acc_stderr": 0.029233465745573096, "acc_norm": 0.599290780141844, "acc_norm_stderr": 0.029233465745573096 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5749674054758801, "acc_stderr": 0.012625879884891989, "acc_norm": 0.5749674054758801, "acc_norm_stderr": 0.012625879884891989 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8125, "acc_stderr": 0.023709788253811766, "acc_norm": 0.8125, "acc_norm_stderr": 0.023709788253811766 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8006535947712419, "acc_stderr": 0.016162402875061398, "acc_norm": 0.8006535947712419, "acc_norm_stderr": 0.016162402875061398 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8244897959183674, "acc_stderr": 0.024352800722970015, "acc_norm": 0.8244897959183674, "acc_norm_stderr": 0.024352800722970015 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700643, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700643 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.028762349126466125, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466125 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8713450292397661, "acc_stderr": 0.025679342723276908, "acc_norm": 0.8713450292397661, "acc_norm_stderr": 0.025679342723276908 }, "harness|truthfulqa:mc|0": { "mc1": 0.3072215422276622, "mc1_stderr": 0.01615020132132301, "mc2": 0.4593002272815368, "mc2_stderr": 0.014606974103928553 }, "harness|winogrande|5": { "acc": 0.8145224940805051, "acc_stderr": 0.010923965303140505 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
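As a small worked example on the results above, the snippet below averages a handful of the per-subtask 5-shot MMLU accuracies copied verbatim from the "Latest results" JSON. The subset is for illustration only; the leaderboard itself aggregates over all 57 MMLU (hendrycksTest) subtasks.

```python
from statistics import mean

# A few 5-shot MMLU subtask accuracies, copied verbatim from the
# "Latest results" JSON above; any subset works the same way.
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.42,
    "harness|hendrycksTest-anatomy|5": 0.6962962962962963,
    "harness|hendrycksTest-astronomy|5": 0.8355263157894737,
    "harness|hendrycksTest-college_chemistry|5": 0.48,
    "harness|hendrycksTest-world_religions|5": 0.8713450292397661,
}

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
print(f"Mean accuracy over {len(mmlu)} sampled MMLU subtasks: {mean(mmlu.values()):.4f}")
```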
open-llm-leaderboard/details_TriadParty__deepmoney-34b-200k-base
[ "region:us" ]
2024-01-11T10:40:12+00:00
{"pretty_name": "Evaluation run of TriadParty/deepmoney-34b-200k-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [TriadParty/deepmoney-34b-200k-base](https://huggingface.co/TriadParty/deepmoney-34b-200k-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TriadParty__deepmoney-34b-200k-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T10:43:46.493782](https://huggingface.co/datasets/open-llm-leaderboard/details_TriadParty__deepmoney-34b-200k-base/blob/main/results_2024-01-11T10-43-46.493782.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7256546951493829,\n \"acc_stderr\": 0.028872391946942647,\n \"acc_norm\": 0.7403326146455504,\n \"acc_norm_stderr\": 0.029642883506415262,\n \"mc1\": 0.3072215422276622,\n \"mc1_stderr\": 0.01615020132132301,\n \"mc2\": 0.4593002272815368,\n \"mc2_stderr\": 0.014606974103928553\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.014280522667467327,\n \"acc_norm\": 0.6399317406143344,\n \"acc_norm_stderr\": 0.014027516814585188\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6435968930491934,\n \"acc_stderr\": 0.00477957440277138,\n \"acc_norm\": 0.8386775542720574,\n \"acc_norm_stderr\": 0.0036707636737929607\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n \"acc_stderr\": 0.039725528847851375,\n \"acc_norm\": 0.6962962962962963,\n \"acc_norm_stderr\": 0.039725528847851375\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.030167533468632726,\n \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.030167533468632726\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8188679245283019,\n \"acc_stderr\": 0.023702963526757798,\n \"acc_norm\": 0.8188679245283019,\n \"acc_norm_stderr\": 0.023702963526757798\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.028919802956134912,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.028919802956134912\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889774,\n \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889774\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7379310344827587,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.7379310344827587,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6005291005291006,\n \"acc_stderr\": 0.02522545028406793,\n \"acc_norm\": 0.6005291005291006,\n \"acc_norm_stderr\": 0.02522545028406793\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.01754510295165663,\n \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.01754510295165663\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509568,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509568\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909029,\n \"acc_norm\": 0.9792746113989638,\n 
\"acc_norm_stderr\": 0.010281417011909029\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246797,\n \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246797\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387266,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387266\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8319327731092437,\n \"acc_stderr\": 0.024289102115692265,\n \"acc_norm\": 0.8319327731092437,\n \"acc_norm_stderr\": 0.024289102115692265\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9100917431192661,\n \"acc_stderr\": 0.012264304540230435,\n \"acc_norm\": 0.9100917431192661,\n \"acc_norm_stderr\": 0.012264304540230435\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.03293377139415191,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.03293377139415191\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8872549019607843,\n \"acc_stderr\": 0.02219857103945679,\n \"acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.02219857103945679\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597453\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.032472243899179486,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.032472243899179486\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.028311601441438603,\n \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.028311601441438603\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.033932957297610096,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.033932957297610096\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.017456987872436193,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.017456987872436193\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.011238260831648321,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.011238260831648321\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.02077676110251299,\n \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.02077676110251299\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5027932960893855,\n \"acc_stderr\": 0.016722240595491714,\n \"acc_norm\": 0.5027932960893855,\n \"acc_norm_stderr\": 0.016722240595491714\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.02117062301121352,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.02117062301121352\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8327974276527331,\n \"acc_stderr\": 0.021193872528034965,\n \"acc_norm\": 0.8327974276527331,\n \"acc_norm_stderr\": 0.021193872528034965\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.01976645956359726,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.01976645956359726\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.599290780141844,\n \"acc_stderr\": 0.029233465745573096,\n \"acc_norm\": 0.599290780141844,\n \"acc_norm_stderr\": 0.029233465745573096\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5749674054758801,\n \"acc_stderr\": 0.012625879884891989,\n \"acc_norm\": 0.5749674054758801,\n \"acc_norm_stderr\": 0.012625879884891989\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8006535947712419,\n \"acc_stderr\": 0.016162402875061398,\n \"acc_norm\": 0.8006535947712419,\n \"acc_norm_stderr\": 0.016162402875061398\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.024352800722970015,\n \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.024352800722970015\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n \"mc1_stderr\": 0.01615020132132301,\n \"mc2\": 0.4593002272815368,\n \"mc2_stderr\": 0.014606974103928553\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 
0.0\n }\n}\n```", "repo_url": "https://huggingface.co/TriadParty/deepmoney-34b-200k-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-43-46.493782.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-43-46.493782.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-43-46.493782.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-43-46.493782.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-43-46.493782.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T10_43_46.493782", "path": ["**/details_harness|winogrande|5_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T10-43-46.493782.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T10_37_59.739390", "path": ["results_2024-01-11T10-37-59.739390.parquet"]}, {"split": "2024_01_11T10_43_46.493782", "path": ["results_2024-01-11T10-43-46.493782.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T10-43-46.493782.parquet"]}]}]}
2024-01-11T10:46:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TriadParty/deepmoney-34b-200k-base

Dataset automatically created during the evaluation run of model TriadParty/deepmoney-34b-200k-base on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2024-01-11T10:43:46.493782 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

## Dataset Details

### Dataset Description

- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:

### Dataset Sources [optional]

- Repository:
- Paper [optional]:
- Demo [optional]:

## Uses

### Direct Use

### Out-of-Scope Use

## Dataset Structure

## Dataset Creation

### Curation Rationale

### Source Data

#### Data Collection and Processing

#### Who are the source data producers?

### Annotations [optional]

#### Annotation process

#### Who are the annotators?

#### Personal and Sensitive Information

## Bias, Risks, and Limitations

### Recommendations

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

[optional]

BibTeX:

APA:

## Glossary [optional]

## More Information [optional]

## Dataset Card Authors [optional]

## Dataset Card Contact
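A minimal sketch of that loading step, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming for this model; any of the 63 config names from the metadata above (e.g. `harness_winogrande_5`) should work:

```python
from datasets import load_dataset

# repo id assumed from the leaderboard's "details_<org>__<model>" pattern
data = load_dataset(
    "open-llm-leaderboard/details_TriadParty__deepmoney-34b-200k-base",
    "harness_winogrande_5",  # any config_name listed in the metadata works
    split="train",  # "train" always points at the latest results
)
```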
[ "# Dataset Card for Evaluation run of TriadParty/deepmoney-34b-200k-base\n\n\n\nDataset automatically created during the evaluation run of model TriadParty/deepmoney-34b-200k-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:43:46.493782(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TriadParty/deepmoney-34b-200k-base\n\n\n\nDataset automatically created during the evaluation run of model TriadParty/deepmoney-34b-200k-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:43:46.493782(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
1cd5a02b608ab2fe154f1692d04059e3a48ced1f
# Dataset of bena/ベナ/贝娜 (Arknights)

This is the dataset of bena/ベナ/贝娜 (Arknights), containing 15 images and their tags.

The core tags of this character are `bangs, short_hair, horns, red_eyes, blonde_hair, brown_hair, hairband`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:---|---:|:---|:---|:---|:---|
| raw | 15 | 21.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bena_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 11.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bena_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 24.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bena_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 19.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bena_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 38.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bena_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/bena_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|--:|--------:|:------|:------|:------|:------|:------|:-----|
| 0 | 15 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | short_sleeves, 1girl, solo, looking_at_viewer, black_choker, smile, dress, holding, shirt, shoes, sitting, open_mouth, simple_background, socks |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | short_sleeves | 1girl | solo | looking_at_viewer | black_choker | smile | dress | holding | shirt | shoes | sitting | open_mouth | simple_background | socks |
|--:|--------:|:------|:------|:------|:------|:------|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|
| 0 | 15 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
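The IMG+TXT packages in the table above can be fetched the same way as the raw archive; they differ only in filename. A minimal sketch for the 800px variant (the flat image-plus-tag-file layout inside the zip is an assumption; adjust if the archive nests a directory):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/bena_arknights',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# IMG+TXT packages pair each image with a same-named .txt tag file
# (assumed layout)
for name in sorted(os.listdir(dataset_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(dataset_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip()[:80])
```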
CyberHarem/bena_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T10:40:51+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T10:44:40+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of bena/ベナ/贝娜 (Arknights)
=================================

This is the dataset of bena/ベナ/贝娜 (Arknights), containing 15 images and their tags.

The core tags of this character are 'bangs, short_hair, horns, red_eyes, blonde_hair, brown_hair, hairband', which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).

List of Packages
----------------

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.

List of Clusters
----------------

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
4599462853fa7e9c4ebccd68c1b997c3f0bfa6c1
# Dataset of bubble/バブル/泡泡 (Arknights)

This is the dataset of bubble/バブル/泡泡 (Arknights), containing 24 images and their tags.

The core tags of this character are `brown_hair, long_hair, horns, single_horn, animal_ears, horse_ears, ponytail, bow, bangs, green_bow, hair_ornament, green_eyes, hairclip, tail`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:---|---:|:---|:---|:---|:---|
| raw | 24 | 26.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bubble_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 16.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bubble_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 51 | 31.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bubble_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 23.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bubble_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 51 | 43.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bubble_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/bubble_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|--:|--------:|:------|:------|:------|:------|:------|:-----|
| 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, armor, gloves, looking_at_viewer, open_mouth, blush, white_background |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | armor | gloves | looking_at_viewer | open_mouth | blush | white_background |
|--:|--------:|:------|:------|:------|:------|:------|:--|:--|:--|:--|:--|:--|:--|:--|:--|
| 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X |
CyberHarem/bubble_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T10:40:52+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T10:45:38+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of bubble/バブル/泡泡 (Arknights)
====================================

This is the dataset of bubble/バブル/泡泡 (Arknights), containing 24 images and their tags.

The core tags of this character are 'brown_hair, long_hair, horns, single_horn, animal_ears, horse_ears, ponytail, bow, bangs, green_bow, hair_ornament, green_eyes, hairclip, tail', which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).

List of Packages
----------------

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.

List of Clusters
----------------

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
f795386e36e4554a2b9f2ff13f1a49722659e108
# Dataset of leto/烈夏 (Arknights)

This is the dataset of leto/烈夏 (Arknights), containing 15 images and their tags.

The core tags of this character are `animal_ears, bear_ears, multicolored_hair, streaked_hair, bangs, red_eyes, hair_ornament, short_hair, white_hair, black_hair, x_hair_ornament, bear_girl, brown_hair, breasts`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:---|---:|:---|:---|:---|:---|
| raw | 15 | 31.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leto_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 13.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leto_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 30.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leto_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 24.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leto_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 53.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leto_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/leto_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|--:|--------:|:------|:------|:------|:------|:------|:-----|
| 0 | 15 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, long_sleeves, pleated_skirt, black_jacket, white_shirt, open_jacket, smile, looking_at_viewer, midriff, miniskirt, navel, black_skirt, open_mouth, sailor_collar, scarf, black_gloves, fingerless_gloves, red_neckerchief, simple_background, stomach, suspenders, cowboy_shot, red_thighhighs, serafuku, standing, blue_skirt, crop_top_overhang, white_background, zettai_ryouiki |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | long_sleeves | pleated_skirt | black_jacket | white_shirt | open_jacket | smile | looking_at_viewer | midriff | miniskirt | navel | black_skirt | open_mouth | sailor_collar | scarf | black_gloves | fingerless_gloves | red_neckerchief | simple_background | stomach | suspenders | cowboy_shot | red_thighhighs | serafuku | standing | blue_skirt | crop_top_overhang | white_background | zettai_ryouiki |
|--:|--------:|:------|:------|:------|:------|:------|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|
| 0 | 15 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
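The cluster tables above are computed from per-image tags; the `LocalSource` loader from the snippet is enough to eyeball tag frequencies yourself. A rough sketch, assuming `meta['tags']` yields tag names when iterated (true for both a list and a tag-to-score mapping):

```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag appears across the extracted raw dataset
counter = Counter()
for item in LocalSource('dataset_dir'):
    for tag in item.meta['tags']:  # iterating a dict yields its keys
        counter[tag] += 1

for tag, n in counter.most_common(15):
    print(f'{n:3d}  {tag}')
```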
CyberHarem/leto_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T10:40:59+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T11:00:13+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of leto/烈夏 (Arknights)
==============================

This is the dataset of leto/烈夏 (Arknights), containing 15 images and their tags.

The core tags of this character are 'animal_ears, bear_ears, multicolored_hair, streaked_hair, bangs, red_eyes, hair_ornament, short_hair, white_hair, black_hair, x_hair_ornament, bear_girl, brown_hair, breasts', which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).

List of Packages
----------------

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.

List of Clusters
----------------

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
e9452574063114ba40a5c0227f76e4b7c65c5313
# Dataset of spuria/スプリア/空构 (Arknights)

This is the dataset of spuria/スプリア/空构 (Arknights), containing 25 images and their tags.

The core tags of this character are `halo, short_hair, blue_hair, bangs, breasts, earrings`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:---|---:|:---|:---|:---|:---|
| raw | 25 | 36.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spuria_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 17.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spuria_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 60 | 36.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spuria_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 30.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spuria_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 60 | 57.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spuria_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/spuria_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|--:|--------:|:------|:------|:------|:------|:------|:-----|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, jacket, looking_at_viewer, solo, collarbone, long_sleeves, white_gloves, upper_body, white_background, closed_mouth, grey_eyes, grey_gloves, cuffs, grin, holding, jewelry, pants, simple_background |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jacket | looking_at_viewer | solo | collarbone | long_sleeves | white_gloves | upper_body | white_background | closed_mouth | grey_eyes | grey_gloves | cuffs | grin | holding | jewelry | pants | simple_background |
|--:|--------:|:------|:------|:------|:------|:------|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/spuria_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T10:41:03+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T10:49:03+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of spuria/スプリア/空构 (Arknights)
=====================================

This is the dataset of spuria/スプリア/空构 (Arknights), containing 25 images and their tags.

The core tags of this character are 'halo, short_hair, blue_hair, bangs, breasts, earrings', which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).

List of Packages
----------------

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.

List of Clusters
----------------

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
95ba64dfee5556e3ee424ca3d77eead40c0fe4a6
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-13b](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-01-11T10:41:02.559560](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b/blob/main/results_2024-01-11T10-41-02.559560.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.5439224806952777, "acc_stderr": 0.03385664727107593, "acc_norm": 0.5487504479677631, "acc_norm_stderr": 0.03457750439538936, "mc1": 0.2521419828641371, "mc1_stderr": 0.015201522246299953, "mc2": 0.40426755196558856, "mc2_stderr": 0.014274265427871576 }, "harness|arc:challenge|25": { "acc": 0.5341296928327645, "acc_stderr": 0.0145773113152311, "acc_norm": 0.5699658703071673, "acc_norm_stderr": 0.014467631559137994 }, "harness|hellaswag|10": { "acc": 0.6060545708026289, "acc_stderr": 0.004876243842318606, "acc_norm": 0.8089026090420235, "acc_norm_stderr": 0.003923620666711541 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5185185185185185, "acc_stderr": 0.04316378599511326, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.04316378599511326 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5723684210526315, "acc_stderr": 0.040260970832965634, "acc_norm": 0.5723684210526315, "acc_norm_stderr": 0.040260970832965634 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6037735849056604, "acc_stderr": 0.030102793781791197, "acc_norm": 0.6037735849056604, "acc_norm_stderr": 0.030102793781791197 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5694444444444444, "acc_stderr": 0.04140685639111503, "acc_norm": 0.5694444444444444, "acc_norm_stderr": 0.04140685639111503 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.04999999999999999, "acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4913294797687861, "acc_stderr": 0.03811890988940412, "acc_norm": 0.4913294797687861, "acc_norm_stderr": 0.03811890988940412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201943, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201943 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4340425531914894, "acc_stderr": 0.032400380867927465, "acc_norm": 0.4340425531914894, "acc_norm_stderr": 0.032400380867927465 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2982456140350877, "acc_stderr": 0.043036840335373146, "acc_norm": 0.2982456140350877, "acc_norm_stderr": 0.043036840335373146 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36507936507936506, "acc_stderr": 0.024796060602699958, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.024796060602699958 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2619047619047619, "acc_stderr": 0.03932537680392871, "acc_norm": 0.2619047619047619, "acc_norm_stderr": 0.03932537680392871 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.667741935483871, "acc_stderr": 0.02679556084812281, "acc_norm": 0.667741935483871, "acc_norm_stderr": 0.02679556084812281 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4187192118226601, "acc_stderr": 0.03471192860518468, "acc_norm": 0.4187192118226601, "acc_norm_stderr": 0.03471192860518468 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6242424242424243, "acc_stderr": 0.037818873532059816, "acc_norm": 0.6242424242424243, "acc_norm_stderr": 0.037818873532059816 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6616161616161617, "acc_stderr": 0.03371124142626302, "acc_norm": 0.6616161616161617, "acc_norm_stderr": 0.03371124142626302 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.772020725388601, "acc_stderr": 0.030276909945178263, "acc_norm": 0.772020725388601, "acc_norm_stderr": 0.030276909945178263 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4641025641025641, "acc_stderr": 0.025285585990017848, "acc_norm": 0.4641025641025641, "acc_norm_stderr": 0.025285585990017848 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.027940457136228402, "acc_norm": 0.3, "acc_norm_stderr": 0.027940457136228402 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.542016806722689, "acc_stderr": 0.03236361111951941, "acc_norm": 0.542016806722689, "acc_norm_stderr": 0.03236361111951941 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 
0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.728440366972477, "acc_stderr": 0.019069098363191428, "acc_norm": 0.728440366972477, "acc_norm_stderr": 0.019069098363191428 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4212962962962963, "acc_stderr": 0.03367462138896078, "acc_norm": 0.4212962962962963, "acc_norm_stderr": 0.03367462138896078 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6764705882352942, "acc_stderr": 0.032834720561085606, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.032834720561085606 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7341772151898734, "acc_stderr": 0.028756799629658335, "acc_norm": 0.7341772151898734, "acc_norm_stderr": 0.028756799629658335 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6278026905829597, "acc_stderr": 0.03244305283008731, "acc_norm": 0.6278026905829597, "acc_norm_stderr": 0.03244305283008731 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6335877862595419, "acc_stderr": 0.04225875451969637, "acc_norm": 0.6335877862595419, "acc_norm_stderr": 0.04225875451969637 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.040655781409087044, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.040655781409087044 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6748466257668712, "acc_stderr": 0.03680350371286461, "acc_norm": 0.6748466257668712, "acc_norm_stderr": 0.03680350371286461 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.04541609446503948, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.04541609446503948 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8076923076923077, "acc_stderr": 0.02581923325648373, "acc_norm": 0.8076923076923077, "acc_norm_stderr": 0.02581923325648373 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7484035759897829, "acc_stderr": 0.015517322365529638, "acc_norm": 0.7484035759897829, "acc_norm_stderr": 0.015517322365529638 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5924855491329479, "acc_stderr": 0.026454578146931505, "acc_norm": 0.5924855491329479, "acc_norm_stderr": 0.026454578146931505 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25251396648044694, "acc_stderr": 0.014530330201468641, "acc_norm": 0.25251396648044694, "acc_norm_stderr": 0.014530330201468641 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5980392156862745, "acc_stderr": 0.028074158947600653, "acc_norm": 0.5980392156862745, "acc_norm_stderr": 0.028074158947600653 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6045016077170418, "acc_stderr": 0.02777091853142784, "acc_norm": 0.6045016077170418, "acc_norm_stderr": 0.02777091853142784 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6296296296296297, "acc_stderr": 0.026869490744815247, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.026869490744815247 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.3829787234042553, "acc_stderr": 0.02899908090480618, "acc_norm": 0.3829787234042553, "acc_norm_stderr": 0.02899908090480618 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3852672750977836, "acc_stderr": 0.012429485434955192, "acc_norm": 0.3852672750977836, "acc_norm_stderr": 0.012429485434955192 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4742647058823529, "acc_stderr": 0.03033257809455504, "acc_norm": 0.4742647058823529, "acc_norm_stderr": 0.03033257809455504 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5245098039215687, "acc_stderr": 0.020203517280261433, "acc_norm": 0.5245098039215687, "acc_norm_stderr": 0.020203517280261433 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6163265306122448, "acc_stderr": 0.031130880396235933, "acc_norm": 0.6163265306122448, "acc_norm_stderr": 0.031130880396235933 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7114427860696517, "acc_stderr": 0.03203841040213321, "acc_norm": 0.7114427860696517, "acc_norm_stderr": 0.03203841040213321 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-virology|5": { "acc": 0.4457831325301205, "acc_stderr": 0.03869543323472101, "acc_norm": 0.4457831325301205, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7660818713450293, "acc_stderr": 0.032467217651178264, "acc_norm": 0.7660818713450293, "acc_norm_stderr": 0.032467217651178264 }, "harness|truthfulqa:mc|0": { "mc1": 0.2521419828641371, "mc1_stderr": 0.015201522246299953, "mc2": 0.40426755196558856, "mc2_stderr": 0.014274265427871576 }, "harness|winogrande|5": { "acc": 0.7687450670876085, "acc_stderr": 0.011850040124850508 }, "harness|gsm8k|5": { "acc": 0.27293404094010615, "acc_stderr": 0.012270381151108749 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
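As a practical footnote to this card, here is a minimal sketch of pulling the aggregated metrics out of this repository. It assumes only that the `datasets` library is installed; the configuration name `results` and the split name `latest` are taken from the configuration listing recorded below, not invented here:

```python
from datasets import load_dataset

# A sketch, not part of the original card: "results" stores the aggregated
# metrics of the run, and the "latest" split points at the most recent
# evaluation (here 2024-01-11T10:41:02.559560).
aggregated = load_dataset(
    "open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b",
    "results",
    split="latest",
)
print(aggregated[0])  # one row holding the run's aggregated metrics
```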
open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b
[ "region:us" ]
2024-01-11T10:43:37+00:00
{"pretty_name": "Evaluation run of elyza/ELYZA-japanese-Llama-2-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-13b](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T10:41:02.559560](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b/blob/main/results_2024-01-11T10-41-02.559560.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5439224806952777,\n \"acc_stderr\": 0.03385664727107593,\n \"acc_norm\": 0.5487504479677631,\n \"acc_norm_stderr\": 0.03457750439538936,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.40426755196558856,\n \"mc2_stderr\": 0.014274265427871576\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5341296928327645,\n \"acc_stderr\": 0.0145773113152311,\n \"acc_norm\": 0.5699658703071673,\n \"acc_norm_stderr\": 0.014467631559137994\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6060545708026289,\n \"acc_stderr\": 0.004876243842318606,\n \"acc_norm\": 0.8089026090420235,\n \"acc_norm_stderr\": 0.003923620666711541\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04316378599511326,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04316378599511326\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.040260970832965634,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.040260970832965634\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699958,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699958\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n \"acc_stderr\": 0.02679556084812281,\n \"acc_norm\": 0.667741935483871,\n \"acc_norm_stderr\": 0.02679556084812281\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.037818873532059816,\n \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.037818873532059816\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626302,\n \"acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626302\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178263,\n \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178263\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017848,\n \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017848\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.728440366972477,\n \"acc_stderr\": 0.019069098363191428,\n \"acc_norm\": 0.728440366972477,\n \"acc_norm_stderr\": 0.019069098363191428\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658335,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658335\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.040655781409087044,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.040655781409087044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.02581923325648373,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.02581923325648373\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7484035759897829,\n \"acc_stderr\": 0.015517322365529638,\n 
\"acc_norm\": 0.7484035759897829,\n \"acc_norm_stderr\": 0.015517322365529638\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931505,\n \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931505\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n \"acc_stderr\": 0.014530330201468641,\n \"acc_norm\": 0.25251396648044694,\n \"acc_norm_stderr\": 0.014530330201468641\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.028074158947600653,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.028074158947600653\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.026869490744815247,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.026869490744815247\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3852672750977836,\n \"acc_stderr\": 0.012429485434955192,\n \"acc_norm\": 0.3852672750977836,\n \"acc_norm_stderr\": 0.012429485434955192\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.03033257809455504,\n \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.03033257809455504\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5245098039215687,\n \"acc_stderr\": 0.020203517280261433,\n \"acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.020203517280261433\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235933,\n \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235933\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.032467217651178264,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.032467217651178264\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.40426755196558856,\n \"mc2_stderr\": 0.014274265427871576\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27293404094010615,\n \"acc_stderr\": 0.012270381151108749\n }\n}\n```", "repo_url": 
"https://huggingface.co/elyza/ELYZA-japanese-Llama-2-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-41-02.559560.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-41-02.559560.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-41-02.559560.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-41-02.559560.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-41-02.559560.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T10_41_02.559560", "path": ["**/details_harness|winogrande|5_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T10-41-02.559560.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T10_41_02.559560", "path": ["results_2024-01-11T10-41-02.559560.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T10-41-02.559560.parquet"]}]}]}
2024-01-11T10:43:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b Dataset automatically created during the evaluation run of model elyza/ELYZA-japanese-Llama-2-13b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T10:41:02.559560 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b\n\n\n\nDataset automatically created during the evaluation run of model elyza/ELYZA-japanese-Llama-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:41:02.559560(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b\n\n\n\nDataset automatically created during the evaluation run of model elyza/ELYZA-japanese-Llama-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:41:02.559560(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
ede17d0c6cd6cc0b240b07e7cf64c32a8e6cfeb5
# Dataset Card for "style_transfer_paintings_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
VaggP/style_transfer_paintings_dataset
[ "region:us" ]
2024-01-11T10:44:15+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "generated", "1": "original"}}}}], "splits": [{"name": "train", "num_bytes": 6904526021.588, "num_examples": 4913}, {"name": "test", "num_bytes": 2137893838.395, "num_examples": 1235}], "download_size": 10900941346, "dataset_size": 9042419859.983}}
2024-01-11T13:11:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for "style_transfer_paintings_dataset" More Information needed
[ "# Dataset Card for \"style_transfer_paintings_dataset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"style_transfer_paintings_dataset\"\n\nMore Information needed" ]
e3538ef362f3974052649fc65b28d9702aaff569
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-instruct <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-13b-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-13b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-instruct", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T10:45:22.609488](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-instruct/blob/main/results_2024-01-11T10-45-22.609488.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.553812579681834, "acc_stderr": 0.03374918001946147, "acc_norm": 0.5614408486544806, "acc_norm_stderr": 0.034503449536591596, "mc1": 0.26805385556915545, "mc1_stderr": 0.015506204722834562, "mc2": 0.42399968384912, "mc2_stderr": 0.015234950600007386 }, "harness|arc:challenge|25": { "acc": 0.5366894197952219, "acc_stderr": 0.01457200052775699, "acc_norm": 0.5836177474402731, "acc_norm_stderr": 0.01440561827943617 }, "harness|hellaswag|10": { "acc": 0.629555865365465, "acc_stderr": 0.004819367172685965, "acc_norm": 0.8220474009161521, "acc_norm_stderr": 0.00381691171167917 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5185185185185185, "acc_stderr": 0.043163785995113245, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5723684210526315, "acc_stderr": 0.04026097083296564, "acc_norm": 0.5723684210526315, "acc_norm_stderr": 0.04026097083296564 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04122728707651282, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04122728707651282 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr":
0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5375722543352601, "acc_stderr": 0.0380168510452446, "acc_norm": 0.5375722543352601, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.04755129616062946, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.04755129616062946 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.41702127659574467, "acc_stderr": 0.03223276266711712, "acc_norm": 0.41702127659574467, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.040969851398436716, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.040969851398436716 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.373015873015873, "acc_stderr": 0.02490699045899257, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.02490699045899257 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.04375888492727061, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.04375888492727061 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6741935483870968, "acc_stderr": 0.026662010578567104, "acc_norm": 0.6741935483870968, "acc_norm_stderr": 0.026662010578567104 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6484848484848484, "acc_stderr": 0.037282069986826503, "acc_norm": 0.6484848484848484, "acc_norm_stderr": 0.037282069986826503 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.030031147977641538, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.030031147977641538 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5256410256410257, "acc_stderr": 0.02531764972644866, "acc_norm": 0.5256410256410257, "acc_norm_stderr": 0.02531764972644866 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.29259259259259257, "acc_stderr": 0.02773896963217609, "acc_norm": 0.29259259259259257, "acc_norm_stderr": 0.02773896963217609 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5798319327731093, "acc_stderr": 0.03206183783236153, "acc_norm": 0.5798319327731093, "acc_norm_stderr": 0.03206183783236153 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526733, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526733 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7559633027522936, "acc_stderr": 0.018415286351416406, "acc_norm": 0.7559633027522936, "acc_norm_stderr": 0.018415286351416406 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.48148148148148145, "acc_stderr": 0.03407632093854052, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.03407632093854052 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7009803921568627, "acc_stderr": 0.03213325717373615, "acc_norm": 0.7009803921568627, "acc_norm_stderr": 0.03213325717373615 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7172995780590717, "acc_stderr": 0.02931281415395593, "acc_norm": 0.7172995780590717, "acc_norm_stderr": 0.02931281415395593 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6322869955156951, "acc_stderr": 0.03236198350928275, "acc_norm": 0.6322869955156951, "acc_norm_stderr": 0.03236198350928275 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6335877862595419, "acc_stderr": 0.042258754519696365, "acc_norm": 0.6335877862595419, "acc_norm_stderr": 0.042258754519696365 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070417, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6851851851851852, "acc_stderr": 0.04489931073591312, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 0.04489931073591312 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6748466257668712, "acc_stderr": 0.036803503712864616, "acc_norm": 0.6748466257668712, "acc_norm_stderr": 0.036803503712864616 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.04432804055291519, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.04432804055291519 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.043546310772605956, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.043546310772605956 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8205128205128205, "acc_stderr": 0.02514093595033544, "acc_norm": 0.8205128205128205, "acc_norm_stderr": 0.02514093595033544 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7624521072796935, "acc_stderr": 0.015218733046150193, "acc_norm": 0.7624521072796935, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6098265895953757, "acc_stderr": 0.026261677607806642, "acc_norm": 0.6098265895953757, "acc_norm_stderr": 0.026261677607806642 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2905027932960894, "acc_stderr": 0.015183844307206143, "acc_norm": 0.2905027932960894, "acc_norm_stderr": 0.015183844307206143 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6339869281045751, "acc_stderr": 0.027582811415159614, "acc_norm": 0.6339869281045751, "acc_norm_stderr": 0.027582811415159614 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6302250803858521, "acc_stderr": 0.027417996705630995, "acc_norm": 0.6302250803858521, "acc_norm_stderr": 0.027417996705630995 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5895061728395061, "acc_stderr": 0.027371350925124764, 
"acc_norm": 0.5895061728395061, "acc_norm_stderr": 0.027371350925124764 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3900709219858156, "acc_stderr": 0.029097675599463926, "acc_norm": 0.3900709219858156, "acc_norm_stderr": 0.029097675599463926 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3983050847457627, "acc_stderr": 0.012503310565166244, "acc_norm": 0.3983050847457627, "acc_norm_stderr": 0.012503310565166244 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5551470588235294, "acc_stderr": 0.03018753206032938, "acc_norm": 0.5551470588235294, "acc_norm_stderr": 0.03018753206032938 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5326797385620915, "acc_stderr": 0.020184583359102202, "acc_norm": 0.5326797385620915, "acc_norm_stderr": 0.020184583359102202 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6285714285714286, "acc_stderr": 0.030932858792789855, "acc_norm": 0.6285714285714286, "acc_norm_stderr": 0.030932858792789855 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7313432835820896, "acc_stderr": 0.03134328358208954, "acc_norm": 0.7313432835820896, "acc_norm_stderr": 0.03134328358208954 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.03887971849597264, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7660818713450293, "acc_stderr": 0.032467217651178264, "acc_norm": 0.7660818713450293, "acc_norm_stderr": 0.032467217651178264 }, "harness|truthfulqa:mc|0": { "mc1": 0.26805385556915545, "mc1_stderr": 0.015506204722834562, "mc2": 0.42399968384912, "mc2_stderr": 0.015234950600007386 }, "harness|winogrande|5": { "acc": 0.7521704814522494, "acc_stderr": 0.012134386019865346 }, "harness|gsm8k|5": { "acc": 0.14480667172100076, "acc_stderr": 0.009693234799052713 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
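The snippet in the card covers one per-task config; a companion sketch (same assumptions: the `datasets` library, plus the `results` config and `latest` split naming this card describes) for pulling the aggregated scores:

```python
# Minimal sketch, assuming the "results" config and "latest" split
# conventions described in this card.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-instruct",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the newest run
```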
open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-instruct
[ "region:us" ]
2024-01-11T10:47:47+00:00
{"pretty_name": "Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-13b-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-13b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T10:45:22.609488](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-instruct/blob/main/results_2024-01-11T10-45-22.609488.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.553812579681834,\n \"acc_stderr\": 0.03374918001946147,\n \"acc_norm\": 0.5614408486544806,\n \"acc_norm_stderr\": 0.034503449536591596,\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834562,\n \"mc2\": 0.42399968384912,\n \"mc2_stderr\": 0.015234950600007386\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5366894197952219,\n \"acc_stderr\": 0.01457200052775699,\n \"acc_norm\": 0.5836177474402731,\n \"acc_norm_stderr\": 0.01440561827943617\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n \"acc_stderr\": 0.004819367172685965,\n \"acc_norm\": 0.8220474009161521,\n \"acc_norm_stderr\": 0.00381691171167917\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04122728707651282\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n \"acc_stderr\": 0.026662010578567104,\n \"acc_norm\": 0.6741935483870968,\n \"acc_norm_stderr\": 0.026662010578567104\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n \"acc_norm\": 0.7772020725388601,\n 
\"acc_norm_stderr\": 0.030031147977641538\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.02531764972644866,\n \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.02531764972644866\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236153,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236153\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416406,\n \"acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416406\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373615,\n \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373615\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.042258754519696365,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.042258754519696365\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n \"acc_stderr\": 0.015183844307206143,\n \"acc_norm\": 0.2905027932960894,\n \"acc_norm_stderr\": 0.015183844307206143\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159614,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159614\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5895061728395061,\n \"acc_stderr\": 0.027371350925124764,\n \"acc_norm\": 0.5895061728395061,\n \"acc_norm_stderr\": 0.027371350925124764\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3983050847457627,\n \"acc_stderr\": 0.012503310565166244,\n \"acc_norm\": 0.3983050847457627,\n \"acc_norm_stderr\": 0.012503310565166244\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.03018753206032938,\n \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.03018753206032938\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5326797385620915,\n \"acc_stderr\": 0.020184583359102202,\n \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.020184583359102202\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789855,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789855\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.032467217651178264,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.032467217651178264\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834562,\n \"mc2\": 0.42399968384912,\n \"mc2_stderr\": 0.015234950600007386\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865346\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.14480667172100076,\n \"acc_stderr\": 0.009693234799052713\n }\n}\n```", "repo_url": "https://huggingface.co/elyza/ELYZA-japanese-Llama-2-13b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-45-22.609488.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-45-22.609488.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-45-22.609488.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-45-22.609488.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-45-22.609488.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["**/details_harness|winogrande|5_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-11T10-45-22.609488.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T10_45_22.609488", "path": ["results_2024-01-11T10-45-22.609488.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T10-45-22.609488.parquet"]}]}]}
2024-01-11T10:48:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-instruct Dataset automatically created during the evaluation run of model elyza/ELYZA-japanese-Llama-2-13b-instruct on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T10:45:22.609488 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
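The loading snippet referenced in the flattened card text above ("you can for instance do the following:") was dropped during processing; below is a minimal sketch of the call it points to, assuming the repository id follows the leaderboard's `details_<org>__<model>` naming pattern. The configuration and split names are taken verbatim from this record's metadata listing.

```python
from datasets import load_dataset

# Repository id is assumed from the card title; "harness_winogrande_5" and
# "latest" both appear in the configuration listing of this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-instruct",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```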
[ "# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-instruct\n\n\n\nDataset automatically created during the evaluation run of model elyza/ELYZA-japanese-Llama-2-13b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:45:22.609488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-instruct\n\n\n\nDataset automatically created during the evaluation run of model elyza/ELYZA-japanese-Llama-2-13b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:45:22.609488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b225bfeef82abf5871b20ef9475493cdb3315c9f
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-fast <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-13b-fast](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-13b-fast) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-fast", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T10:56:54.496975](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-fast/blob/main/results_2024-01-11T10-56-54.496975.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5436158357444296, "acc_stderr": 0.03390716973340485, "acc_norm": 0.5486678144243484, "acc_norm_stderr": 0.03463583392217699, "mc1": 0.24969400244798043, "mc1_stderr": 0.015152286907148128, "mc2": 0.4031296675879943, "mc2_stderr": 0.014303460791454664 }, "harness|arc:challenge|25": { "acc": 0.5247440273037542, "acc_stderr": 0.01459348769493774, "acc_norm": 0.5588737201365188, "acc_norm_stderr": 0.014509747749064663 }, "harness|hellaswag|10": { "acc": 0.603963353913563, "acc_stderr": 0.004880726787988636, "acc_norm": 0.8073093009360686, "acc_norm_stderr": 0.00393606145515111 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4962962962962963, "acc_stderr": 0.04319223625811331, "acc_norm": 0.4962962962962963, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5263157894736842, "acc_stderr": 0.040633027314866704, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.040633027314866704 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5886792452830188, "acc_stderr": 0.030285009259009787, "acc_norm": 0.5886792452830188, "acc_norm_stderr": 0.030285009259009787 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842426, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842426 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956913,
"acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956913 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4682080924855491, "acc_stderr": 0.03804749744364763, "acc_norm": 0.4682080924855491, "acc_norm_stderr": 0.03804749744364763 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.04755129616062946, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.04755129616062946 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.41702127659574467, "acc_stderr": 0.03223276266711712, "acc_norm": 0.41702127659574467, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.041424397194893624, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.041424397194893624 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192118, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192118 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3544973544973545, "acc_stderr": 0.024636830602842, "acc_norm": 0.3544973544973545, "acc_norm_stderr": 0.024636830602842 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.04073524322147126, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.04073524322147126 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6419354838709678, "acc_stderr": 0.027273890594300645, "acc_norm": 0.6419354838709678, "acc_norm_stderr": 0.027273890594300645 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.43349753694581283, "acc_stderr": 0.03486731727419872, "acc_norm": 0.43349753694581283, "acc_norm_stderr": 0.03486731727419872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6303030303030303, "acc_stderr": 0.03769430314512568, "acc_norm": 0.6303030303030303, "acc_norm_stderr": 0.03769430314512568 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7823834196891192, "acc_stderr": 0.02977866303775296, "acc_norm": 0.7823834196891192, "acc_norm_stderr": 0.02977866303775296 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4897435897435897, "acc_stderr": 0.025345672221942374, "acc_norm": 0.4897435897435897, "acc_norm_stderr": 0.025345672221942374 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.02763490726417854, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.02763490726417854 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5462184873949579, "acc_stderr": 0.03233943468182088, "acc_norm": 0.5462184873949579, "acc_norm_stderr": 0.03233943468182088 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.710091743119266, "acc_stderr": 0.019453066609201597, "acc_norm": 0.710091743119266, "acc_norm_stderr": 0.019453066609201597 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.696078431372549, "acc_stderr": 0.032282103870378935, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.032282103870378935 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7341772151898734, "acc_stderr": 0.028756799629658335, "acc_norm": 0.7341772151898734, "acc_norm_stderr": 0.028756799629658335 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6367713004484304, "acc_stderr": 0.032277904428505, "acc_norm": 0.6367713004484304, "acc_norm_stderr": 0.032277904428505 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6183206106870229, "acc_stderr": 0.0426073515764456, "acc_norm": 0.6183206106870229, "acc_norm_stderr": 0.0426073515764456 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.040655781409087044, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.040655781409087044 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.04414343666854933, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.04414343666854933 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6380368098159509, "acc_stderr": 0.037757007291414416, "acc_norm": 0.6380368098159509, "acc_norm_stderr": 0.037757007291414416 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04547960999764376, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04547960999764376 }, "harness|hendrycksTest-management|5": { "acc": 0.6699029126213593, "acc_stderr": 0.0465614711001235, "acc_norm": 0.6699029126213593, "acc_norm_stderr": 0.0465614711001235 }, "harness|hendrycksTest-marketing|5": { "acc": 0.811965811965812, "acc_stderr": 0.025598193686652237, "acc_norm": 0.811965811965812, "acc_norm_stderr": 0.025598193686652237 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7458492975734355, "acc_stderr": 0.015569254692045752, "acc_norm": 0.7458492975734355, "acc_norm_stderr": 0.015569254692045752 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6242774566473989, "acc_stderr": 0.02607431485165708, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.02607431485165708 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.30837988826815643, "acc_stderr": 0.015445716910998875, "acc_norm": 0.30837988826815643, "acc_norm_stderr": 0.015445716910998875 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6045751633986928, "acc_stderr": 0.027996723180631452, "acc_norm": 0.6045751633986928, "acc_norm_stderr": 0.027996723180631452 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6109324758842444, "acc_stderr": 0.027690337536485376, "acc_norm": 0.6109324758842444, "acc_norm_stderr": 0.027690337536485376 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5895061728395061, "acc_stderr": 0.027371350925124764, "acc_norm": 
0.5895061728395061, "acc_norm_stderr": 0.027371350925124764 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40070921985815605, "acc_stderr": 0.029233465745573086, "acc_norm": 0.40070921985815605, "acc_norm_stderr": 0.029233465745573086 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.378748370273794, "acc_stderr": 0.012389052105003732, "acc_norm": 0.378748370273794, "acc_norm_stderr": 0.012389052105003732 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4485294117647059, "acc_stderr": 0.030211479609121603, "acc_norm": 0.4485294117647059, "acc_norm_stderr": 0.030211479609121603 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5163398692810458, "acc_stderr": 0.020217030653186457, "acc_norm": 0.5163398692810458, "acc_norm_stderr": 0.020217030653186457 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910508, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910508 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6448979591836734, "acc_stderr": 0.030635655150387634, "acc_norm": 0.6448979591836734, "acc_norm_stderr": 0.030635655150387634 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7213930348258707, "acc_stderr": 0.031700561834973086, "acc_norm": 0.7213930348258707, "acc_norm_stderr": 0.031700561834973086 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.8, "acc_stderr": 0.04020151261036845, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.03887971849597264, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.031885780176863984, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.031885780176863984 }, "harness|truthfulqa:mc|0": { "mc1": 0.24969400244798043, "mc1_stderr": 0.015152286907148128, "mc2": 0.4031296675879943, "mc2_stderr": 0.014303460791454664 }, "harness|winogrande|5": { "acc": 0.7719021310181531, "acc_stderr": 0.011793015817663592 }, "harness|gsm8k|5": { "acc": 0.25473843821076575, "acc_stderr": 0.01200173123287914 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
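The example in the card above reads a single per-task configuration; as a complementary sketch, the aggregated metrics can be read through the "results" configuration. The config and split names below follow the configuration listing used throughout this dump; the exact column layout of the results parquet is an assumption and is not shown in this card.

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run; the
# "latest" split points at the most recent evaluation timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-fast",
    "results",
    split="latest",
)
print(results[0])  # assumed: one row holding the aggregated run metrics
```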
open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-fast
[ "region:us" ]
2024-01-11T10:59:16+00:00
{"pretty_name": "Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-fast", "dataset_summary": "Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-13b-fast](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-13b-fast) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-fast\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T10:56:54.496975](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-13b-fast/blob/main/results_2024-01-11T10-56-54.496975.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5436158357444296,\n \"acc_stderr\": 0.03390716973340485,\n \"acc_norm\": 0.5486678144243484,\n \"acc_norm_stderr\": 0.03463583392217699,\n \"mc1\": 0.24969400244798043,\n \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.4031296675879943,\n \"mc2_stderr\": 0.014303460791454664\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5247440273037542,\n \"acc_stderr\": 0.01459348769493774,\n \"acc_norm\": 0.5588737201365188,\n \"acc_norm_stderr\": 0.014509747749064663\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.603963353913563,\n \"acc_stderr\": 0.004880726787988636,\n \"acc_norm\": 0.8073093009360686,\n \"acc_norm_stderr\": 0.00393606145515111\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.040633027314866704,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.040633027314866704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009787,\n \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009787\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.04101405519842426\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\": 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512568,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512568\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.02977866303775296,\n \"acc_norm\": 0.7823834196891192,\n 
\"acc_norm_stderr\": 0.02977866303775296\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.710091743119266,\n \"acc_stderr\": 0.019453066609201597,\n \"acc_norm\": 0.710091743119266,\n \"acc_norm_stderr\": 0.019453066609201597\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.032282103870378935,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.032282103870378935\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658335,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658335\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.0426073515764456,\n \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.0426073515764456\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.040655781409087044,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.040655781409087044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.025598193686652237,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.025598193686652237\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7458492975734355,\n \"acc_stderr\": 0.015569254692045752,\n \"acc_norm\": 0.7458492975734355,\n \"acc_norm_stderr\": 0.015569254692045752\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.015445716910998875,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.015445716910998875\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631452,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.6109324758842444,\n \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5895061728395061,\n \"acc_stderr\": 0.027371350925124764,\n \"acc_norm\": 0.5895061728395061,\n \"acc_norm_stderr\": 0.027371350925124764\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.378748370273794,\n \"acc_stderr\": 0.012389052105003732,\n \"acc_norm\": 0.378748370273794,\n \"acc_norm_stderr\": 0.012389052105003732\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121603,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121603\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.020217030653186457,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.020217030653186457\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387634,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387634\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.4031296675879943,\n \"mc2_stderr\": 0.014303460791454664\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.25473843821076575,\n \"acc_stderr\": 0.01200173123287914\n }\n}\n```", "repo_url": "https://huggingface.co/elyza/ELYZA-japanese-Llama-2-13b-fast", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-56-54.496975.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-56-54.496975.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-56-54.496975.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T10-56-54.496975.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-56-54.496975.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["**/details_harness|winogrande|5_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-11T10-56-54.496975.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T10_56_54.496975", "path": ["results_2024-01-11T10-56-54.496975.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T10-56-54.496975.parquet"]}]}]}
2024-01-11T10:59:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-fast Dataset automatically created during the evaluation run of model elyza/ELYZA-japanese-Llama-2-13b-fast on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T10:56:54.496975 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-fast\n\n\n\nDataset automatically created during the evaluation run of model elyza/ELYZA-japanese-Llama-2-13b-fast on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:56:54.496975(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-13b-fast\n\n\n\nDataset automatically created during the evaluation run of model elyza/ELYZA-japanese-Llama-2-13b-fast on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T10:56:54.496975(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
096b192950eda5b53db2569cc22e390a5dadefa7
# CLIP and T5 tokenization of CC3M
israfelsr/tokenized_cc3m
[ "task_categories:text-generation", "size_categories:1M<n<10M", "language:en", "license:mit", "region:us" ]
2024-01-11T11:02:22+00:00
{"language": ["en"], "license": "mit", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation"], "pretty_name": "CLIP and T5 tokenization of CC3M", "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "clip_ids", "sequence": "int64"}, {"name": "clip_attention_mask", "sequence": "int64"}, {"name": "t5_ids", "sequence": "int64"}, {"name": "t5_attention_mask", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 31520132297, "num_examples": 3318333}, {"name": "validation", "num_bytes": 150459428, "num_examples": 15840}], "download_size": 362821979, "dataset_size": 31670591725}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]}
2024-01-11T11:17:45+00:00
[]
[ "en" ]
TAGS #task_categories-text-generation #size_categories-1M<n<10M #language-English #license-mit #region-us
# CLIP and T5 tokenization of CC3M
[ "# CLIP and T5 tokenization of CC3M" ]
[ "TAGS\n#task_categories-text-generation #size_categories-1M<n<10M #language-English #license-mit #region-us \n", "# CLIP and T5 tokenization of CC3M" ]
6b1c30b74ffe1da23372748d382ce853d8851841
# Dataset Card for "arxiv-2048" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
anumafzal94/arxiv-2048
[ "region:us" ]
2024-01-11T11:06:41+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 216556336, "num_examples": 6438}, {"name": "train", "num_bytes": 176945972.1572892, "num_examples": 5000}, {"name": "validation", "num_bytes": 7324400.389804166, "num_examples": 218}], "download_size": 16706049, "dataset_size": 400826708.5470934}}
2024-01-12T03:23:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for "arxiv-2048" More Information needed
[ "# Dataset Card for \"arxiv-2048\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"arxiv-2048\"\n\nMore Information needed" ]
e2b366129a79d4197f87daaa9a852ff2b142b387
{"id": "130042945016-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062c\u0645\u064a\u0639 \u0637\u0644\u0628\u0627\u062a \u0639\u0642\u0648\u062f \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u062d\u0633\u0628 \u0627\u0644\u0643\u064a\u0627\u0646 : \n \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u064f\u062a\u0645\u0651\u0643\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u064f\u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a\u060c \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0645\u062d\u062f\u0648\u062f\u0629 \u0645\u0646 \u0634\u062e\u0635 \n \u0648\u0627\u062d\u062f\u060c \u0623\u0648 \u0623\u0643\u062b\u0631 \u0645\u0646 \u0630\u0648\u064a \u0627\u0644\u0635\u0641\u0629 \u0627\u0644\u0637\u0628\u064a\u0639\u064a\u0629 \u0623\u0648 \u0627\u0644\u0625\u0639\u062a\u0628\u0627\u0631\u064a\u0629\u060c \u0648\u062a\u0639\u062f \u0630\u0645\u062a\u0647\u0627 \u0645\u0633\u062a\u0642\u0644\u0629 \u0639\u0646 \u0627\u0644\u0630\u0645\u0629 \u0627\u0644\u0645\u0627\u0644\u064a\u0629 \u0644\u0643\u0644 \u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627. \n \u0648\u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u062d\u062f\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0629 \u0639\u0646 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0645\u062a\u0631\u062a\u0628\u0629 \u0639\u0644\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0646\u0627\u0634\u0626\u0629 \u0639\u0646 \u0646\u0634\u0627\u0637\u0647\u0627\u060c \u0648\u0644\u0627 \u064a\u0643\u0648\u0646 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627 \u0648\u0644\u0627 \n \u0627\u0644\u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0622\u064b \u0639\u0646 \u0647\u0630\u0647 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0625\u0644\u0627 \u0628\u0642\u062f\u0631 \u062d\u0635\u062a\u0647 \u0641\u064a \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . 
\n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 :", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"} {"id": "130042945016-1", "text": "\u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0636\u0627\u0645\u0646\u064a\u0629 \u062c\u0645\u064a\u0639 \u0634\u0631\u0643\u0627\u0624\u0647\u0627 \u0623\u0641\u0631\u0627\u062f \n \u0648\u0645\u0633\u0624\u0648\u0644\u064a\u0646 \u0634\u062e\u0635\u064a\u0627\u064b \u0628\u062c\u0645\u064a\u0639 \u0623\u0645\u0648\u0627\u0644\u0647\u0645 \u0648\u0628\u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a\u0647\u0627\u060c \n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0648\u0635\u064a\u0629 \u0628\u0633\u064a\u0637\u0629 \u062a\u062a\u0643\u0648\u0646 \u0645\u0646 \u0641\u0631\u064a\u0642\u064a\u0646\u060c \n (\u0627\u0644\u0645\u062a\u0636\u0627\u0645\u0646) \u0648\u0647\u0648 \u0627\u0644\u0645\u0633\u0624\u0648\u0644 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0629\u060c \u0648(\u0627\u0644\u0645\u0648\u0635\u064a) \u0648\u0647\u0648 \u0627\u0644\u0630\u064a \u0644\u0627 \u064a\u0643\u0648\u0646 \u0645\u0633\u0624\u0648\u0644\u0627\u064b \u0625\u0644\u0627 \u0641\u064a \u062d\u062f\u0648\u062f \u062d\u0635\u062a\u0647 \u0641\u064a \n \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022 \u064a\u062c\u0628 \u0623\u0646 \u064a\u0643\u0648\u0646 \u0627\u0644\u0634\u062e\u0635 \u0627\u0644\u0637\u0628\u064a\u0639\u064a \u0623\u0643\u0628\u0631 \u0645\u0646 18 \u0639\u0627\u0645\u060c \u0648\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0642\u0627\u0635\u0631\u064b\u0627 \u064a\u062a\u0645 \u0625\u0631\u0641\u0627\u0642 \u0635\u0643 \u0627\u0644\u0648\u0644\u0627\u064a\u0629 .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"} {"id": "130042945016-2", "text": "\u2022 \u064a\u062c\u0628 \u0623\u0646 \u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0645\u0648\u0638\u0641\u064a\u0646 \u062d\u0643\u0648\u0645\u064a\u064a\u0646 . \n \u2022 \u0627\u0644\u062a\u062d\u0642\u0651\u064f\u0642 \u0645\u0646 \u0642\u0627\u0639\u062f\u0629 \u0627\u0644\u0639\u0645\u0644 \u0627\u0644\u062e\u0627\u0635\u0629 \u0628\u0628\u0639\u0636 \u0639\u0648\u0627\u0626\u0644 \u0646\u062c\u0631\u0627\u0646 . 
\n \u2022 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0634\u0631\u064a\u0643 \u0625\u0639\u062a\u0628\u0627\u0631\u064a \u064a\u062a\u0645 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0623\u0646 \u064a\u0643\u0648\u0646 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u063a\u064a\u0631 \u0645\u0634\u0637\u0648\u0628 \u0623\u0648 \u0645\u0648\u0642\u0648\u0641 \u0623\u0648 \u0645\u0646\u062a\u0647\u064a . \n \u2022 \u0641\u064a \u062d\u0627\u0644 \u0648\u062c\u0648\u062f \u0634\u0631\u064a\u0643 \u0623\u062c\u0646\u0628\u064a \u064a\u062c\u0628 \u0627\u0644\u062a\u062d\u0642\u0651\u064f\u0642 \u0645\u0646 \u0648\u062c\u0648\u062f \u0631\u062e\u0635\u0629 \u0625\u0633\u062a\u062b\u0645\u0627\u0631 \u0645\u0646 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0625\u0633\u062a\u062b\u0645\u0627\u0631 . \n \u2022 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u062c\u0647\u0629 \u062d\u0643\u0648\u0645\u064a\u0629/\u0645\u0624\u0633\u0633\u0629 \u0623\u0647\u0644\u064a\u0629/\u062c\u0645\u0639\u064a\u0629 \u062e\u064a\u0631\u064a\u0629/ \u0648\u0642\u0641 \"\u064a\u062c\u0628 \u0648\u062c\u0648\u062f \u0633\u0646\u062f \u0646\u0638\u0627\u0645\u064a \u064a\u062e\u0648\u0644\u0647\u0627 \u0628\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0623\u0648 \u0627\u0644\u0645\u0634\u0627\u0631\u0643\u0629 \u0641\u064a \u0634\u0631\u0643\u0629 \". \n \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0627\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a. \n \u2022\u0627\u0644\u0645\u0633\u062a\u0646\u062f\u0627\u062a \u0627\u0644\u0645\u0637\u0644\u0648\u0628\u0629 : \n \u2022 \u0625\u0631\u0641\u0627\u0642 \u062a\u0631\u062e\u064a\u0635 \u0645\u0646 \u0627\u0644\u0628\u0646\u0643 \u0627\u0644\u0645\u0631\u0643\u0632\u064a \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0627\u0644\u0646\u0634\u0627\u0637 \u064a\u062a\u0637\u0644\u0628 \u0630\u0644\u0643.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"} {"id": "130042945016-3", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0625\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0627\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 (\u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 - \u0627\u0644\u062a\u0636\u0627\u0645\u0646 - \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) . 
\n 4\u2022\u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . \n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 - \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629: 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 \u0644\u0644\u0643\u064a\u0627\u0646\u0627\u062a: 500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"} {"id": "130042945016-4", "text": "\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=5 \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=3 \n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=4/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"} {"id": "952da374a2f2-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u064f\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062c\u0645\u064a\u0639 \u0637\u0644\u0628\u0627\u062a \u0639\u0642\u0648\u062f \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u0627\u0644\u0645\u0647\u0646\u064a\u0629 \u0628\u0645\u0648\u062c\u0628 \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0627\u0644\u0645\u0647\u0646\u064a \u0627\u0644\u0635\u0627\u062f\u0631 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u064a\u062a\u0645 \u0627\u0644\u062a\u062d\u0642\u0651\u064f\u0642 \u0645\u0646 \u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629 \u0644\u0644\u0634\u0631\u0643\u0627\u062a \u0627\u0644\u062a\u062c\u0627\u0631\u064a\u0629\u060c 
\u0628\u0627\u0644\u0625\u0636\u0627\u0641\u0629 \u0625\u0644\u0649 \u0627\u0644\u062a\u0627\u0644\u064a: \n \u2022\u064a\u062c\u0628 \u0623\u0646 \u062a\u062a\u0648\u0641\u0631 \u0631\u062e\u0635\u0629 \u0645\u0647\u0646\u064a\u0629 \u0633\u0627\u0631\u064a\u0629 \u0644\u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0648\u0642\u062a \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628\u060c \u0648\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u063a\u064a\u0631 \u0633\u0639\u0648\u062f\u064a \u064a\u062c\u0628 \u0625\u0631\u0641\u0627\u0642 \u062a\u0631\u062e\u064a\u0635 \u0645\u0647\u0646\u064a \u0635\u0627\u062f\u0631 \u0645\u0646 \u062f\u0627\u062e\u0644 \u0627\u0644\u0645\u0645\u0644\u0643\u0629. \n \u2022 \u0623\u0644\u0627 \u062a\u0642\u0644 \u0646\u0633\u0628\u0629 \u0627\u0644\u0634\u0631\u064a\u0643 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0627\u0644\u0645\u0631\u062e\u0651\u064e\u0635 \u0639\u0646 (25%) \u0645\u0646 \u0631\u0623\u0633 \u0645\u0627\u0644 \u0627\u0644\u0634\u0631\u0643\u0629 \u0627\u0644\u0645\u0647\u0646\u064a\u0629 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u062e\u062a\u0644\u0637\u0629. \n \u2022 \u0623\u0644\u0627 \u062a\u0642\u0644 \u0646\u0633\u0628\u0629 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0645\u0631\u062e\u0651\u064e\u0635\u064a\u0646 \u0639\u0646 (70%) \u0644\u0633\u0639\u0648\u062f\u064a \u0648\u0627\u0644\u062e\u0644\u064a\u062c\u064a . \n \u2022 \u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0627\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056720"} {"id": "952da374a2f2-1", "text": "\u2022\u0623\u0646\u0648\u0627\u0639 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0641\u064a \u0639\u0642\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 : \n \u2022\u0634\u0631\u064a\u0643 \u0645\u0631\u062e\u0635 . \n \u2022\u0634\u0631\u064a\u0643 \u0628\u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . \n \u2022\u0634\u0631\u064a\u0643 \u0628\u0627\u0644\u0639\u0645\u0644 ,\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0625\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 (\u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629-\u0627\u0644\u062a\u0636\u0627\u0645\u0646-\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) \u0645\u0647\u0646\u064a\u0629 . 
\n 4\u2022\u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . \n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629: 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056720"} {"id": "952da374a2f2-2", "text": "\u0627\u0644\u062a\u0636\u0627\u0645\u0646-\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629: 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u064a\u0636\u0627\u0641 \u0639\u0644\u064a\u0647\u0627 500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 + 15% \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n \u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=3 \n \u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=5 \n \u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=4/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056720"} {"id": "fd131aa6e7d9-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062c\u0645\u064a\u0639 \u0637\u0644\u0628\u0627\u062a \u0639\u0642\u0648\u062f \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u062d\u0633\u0628 \u0627\u0644\u0643\u064a\u0627\u0646 : \n \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u064f\u062a\u0645\u0651\u0643\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u064f\u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 
\u0627\u0644\u062a\u062c\u0627\u0631\u064a\u060c \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0645\u062d\u062f\u0648\u062f\u0629 \u0645\u0646 \u0634\u062e\u0635 \n \u0648\u0627\u062d\u062f\u060c \u0623\u0648 \u0623\u0643\u062b\u0631 \u0645\u0646 \u0630\u0648\u064a \u0627\u0644\u0635\u0641\u0629 \u0627\u0644\u0637\u0628\u064a\u0639\u064a\u0629 \u0623\u0648 \u0627\u0644\u0625\u0639\u062a\u0628\u0627\u0631\u064a\u0629\u060c \u0648\u062a\u0639\u062f \u0630\u0645\u062a\u0647\u0627 \u0645\u0633\u062a\u0642\u0644\u0629 \u0639\u0646 \u0627\u0644\u0630\u0645\u0629 \u0627\u0644\u0645\u0627\u0644\u064a\u0629 \u0644\u0643\u0644 \u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627. \n \u0648\u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u062d\u062f\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0629 \u0639\u0646 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0645\u062a\u0631\u062a\u0628\u0629 \u0639\u0644\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0646\u0627\u0634\u0626\u0629 \u0639\u0646 \u0646\u0634\u0627\u0637\u0647\u0627\u060c \u0648\u0644\u0627 \u064a\u0643\u0648\u0646 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627 \u0648\u0644\u0627 \n \u0627\u0644\u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0622\u064b \u0639\u0646 \u0647\u0630\u0647 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0625\u0644\u0627 \u0628\u0642\u062f\u0631 \u062d\u0635\u062a\u0647 \u0641\u064a \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 :", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056721"} {"id": "fd131aa6e7d9-1", "text": "\u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0636\u0627\u0645\u0646\u064a\u0629 \u062c\u0645\u064a\u0639 \u0634\u0631\u0643\u0627\u0624\u0647\u0627 \u0623\u0641\u0631\u0627\u062f \n \u0648\u0645\u0633\u0624\u0648\u0644\u064a\u0646 \u0634\u062e\u0635\u064a\u0627\u064b \u0628\u062c\u0645\u064a\u0639 \u0623\u0645\u0648\u0627\u0644\u0647\u0645 \u0648\u0628\u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a\u0647\u0627\u060c \n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0648\u0635\u064a\u0629 \u0628\u0633\u064a\u0637\u0629 \u062a\u062a\u0643\u0648\u0646 \u0645\u0646 \u0641\u0631\u064a\u0642\u064a\u0646\u060c \n (\u0627\u0644\u0645\u062a\u0636\u0627\u0645\u0646) \u0648\u0647\u0648 
\u0627\u0644\u0645\u0633\u0624\u0648\u0644 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0629\u060c \u0648(\u0627\u0644\u0645\u0648\u0635\u064a) \u0648\u0647\u0648 \u0627\u0644\u0630\u064a \u0644\u0627 \u064a\u0643\u0648\u0646 \u0645\u0633\u0624\u0648\u0644\u0627\u064b \u0625\u0644\u0627 \u0641\u064a \u062d\u062f\u0648\u062f \u062d\u0635\u062a\u0647 \u0641\u064a \n \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0631\u062e\u064a\u0635 \u0645\u0646 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0627\u0633\u062a\u062b\u0645\u0627\u0631. \n \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0627\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056721"} {"id": "fd131aa6e7d9-2", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0627\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 ( \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 - \u0627\u0644\u062a\u0636\u0627\u0645\u0646 - \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) \u060c \u0623\u062c\u0646\u0628\u064a\u0629 \n 4\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0627\u0637\u0631\u0627\u0641 . \n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . \n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a\u064b\u0627 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. 
\n \u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 : 500 \u0648 \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 : 15%. \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u0648 \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 : 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056721"} {"id": "fd131aa6e7d9-3", "text": "\u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 : 500 \u0648\u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 : 15%.\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=143 \n * \u0639\u0646\u062f \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u0638\u0647\u0631 \u0644\u0644\u0639\u0645\u0644 \u0646\u0648\u0639 \u0627\u0644\u0643\u064a\u0627\u0646 \u0627\u0644\u0645\u0637\u0644\u0648\u0628 \u0639\u0646\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 : (\u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629-\u0627\u0644\u062a\u0636\u0627\u0645\u0646-\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) \u060c \u0623\u062c\u0646\u0628\u064a\u0629 /n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056721"} {"id": "3742980811a3-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062c\u0645\u064a\u0639 \u0637\u0644\u0628\u0627\u062a \u0639\u0642\u0648\u062f \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u062d\u0633\u0628 \u0627\u0644\u0643\u064a\u0627\u0646 : \n \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u064f\u062a\u0645\u0651\u0643\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u064f\u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a\u060c \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0645\u062d\u062f\u0648\u062f\u0629 \u0645\u0646 \u0634\u062e\u0635 \n \u0648\u0627\u062d\u062f\u060c \u0623\u0648 \u0623\u0643\u062b\u0631 \u0645\u0646 \u0630\u0648\u064a \u0627\u0644\u0635\u0641\u0629 \u0627\u0644\u0637\u0628\u064a\u0639\u064a\u0629 \u0623\u0648 \u0627\u0644\u0625\u0639\u062a\u0628\u0627\u0631\u064a\u0629\u060c \u0648\u062a\u0639\u062f \u0630\u0645\u062a\u0647\u0627 \u0645\u0633\u062a\u0642\u0644\u0629 \u0639\u0646 \u0627\u0644\u0630\u0645\u0629 \u0627\u0644\u0645\u0627\u0644\u064a\u0629 \u0644\u0643\u0644 \u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627. 
\n \u0648\u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u062d\u062f\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0629 \u0639\u0646 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0645\u062a\u0631\u062a\u0628\u0629 \u0639\u0644\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0646\u0627\u0634\u0626\u0629 \u0639\u0646 \u0646\u0634\u0627\u0637\u0647\u0627\u060c \u0648\u0644\u0627 \u064a\u0643\u0648\u0646 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627 \u0648\u0644\u0627 \n \u0627\u0644\u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0622\u064b \u0639\u0646 \u0647\u0630\u0647 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0625\u0644\u0627 \u0628\u0642\u062f\u0631 \u062d\u0635\u062a\u0647 \u0641\u064a \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 :", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056722"} {"id": "3742980811a3-1", "text": "\u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0636\u0627\u0645\u0646\u064a\u0629 \u062c\u0645\u064a\u0639 \u0634\u0631\u0643\u0627\u0624\u0647\u0627 \u0623\u0641\u0631\u0627\u062f \n \u0648\u0645\u0633\u0624\u0648\u0644\u064a\u0646 \u0634\u062e\u0635\u064a\u0627\u064b \u0628\u062c\u0645\u064a\u0639 \u0623\u0645\u0648\u0627\u0644\u0647\u0645 \u0648\u0628\u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a\u0647\u0627\u060c \n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0648\u0635\u064a\u0629 \u0628\u0633\u064a\u0637\u0629 \u062a\u062a\u0643\u0648\u0646 \u0645\u0646 \u0641\u0631\u064a\u0642\u064a\u0646\u060c \n (\u0627\u0644\u0645\u062a\u0636\u0627\u0645\u0646) \u0648\u0647\u0648 \u0627\u0644\u0645\u0633\u0624\u0648\u0644 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0629\u060c \u0648(\u0627\u0644\u0645\u0648\u0635\u064a) \u0648\u0647\u0648 \u0627\u0644\u0630\u064a \u0644\u0627 \u064a\u0643\u0648\u0646 \u0645\u0633\u0624\u0648\u0644\u0627\u064b \u0625\u0644\u0627 \u0641\u064a \u062d\u062f\u0648\u062f \u062d\u0635\u062a\u0647 \u0641\u064a \n \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0631\u062e\u064a\u0635 \u0645\u0646 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0627\u0633\u062a\u062b\u0645\u0627\u0631. 
\n \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0627\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056722"} {"id": "3742980811a3-2", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 ( \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 - \u0627\u0644\u062a\u0636\u0627\u0645\u0646 - \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 ) . \u0645\u062e\u062a\u0644\u0637\u0629 . \n 4\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0627\u0637\u0631\u0627\u0641 . \n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . \n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a\u064b\u0627 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629: 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. 
\n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 \u0648\u0627\u0644\u062a\u0636\u0627\u0645\u0646: 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056722"} {"id": "3742980811a3-3", "text": "\u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 \u0644\u0644\u0643\u064a\u0627\u0646\u0627\u062a: 500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=143 \n \u0639\u0646\u062f \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u0638\u0647\u0631 \u0644\u0644\u0639\u0645\u0644 \u0646\u0648\u0639 \u0627\u0644\u0643\u064a\u0627\u0646 \u0627\u0644\u0645\u0637\u0644\u0648\u0628 \u0639\u0646\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 : (\u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629-\u0627\u0644\u062a\u0636\u0627\u0645\u0646\u064a\u0629-\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) \u060c \u0645\u062e\u062a\u0644\u0637\u0629 /n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056722"} {"id": "d1ca31729b6e-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u0648\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 \u0644\u0644\u0634\u0631\u0643\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0647\u064a\u0626\u0629 \u0627\u0644\u0632\u0643\u0627\u0629 \u0648 \u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0648 \u0627\u0644\u062c\u0645\u0627\u0631\u0643 . \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0639\u062f\u0644. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629. 
\n \u2022 \u0645\u0644\u0627\u062d\u0638\u0629: \n \u2022 \u0641\u064a \u062d\u0627\u0644\u0629 \u0625\u0636\u0627\u0641\u0629 / \u062d\u0630\u0641 \u0628\u064a\u0627\u0646\u0627\u062a \u0634\u0631\u064a\u0643 \u0628\u0646\u0627\u0621\u064b \u0639\u0644\u0649 \u062d\u0643\u0645 \u0642\u0636\u0627\u0626\u064a \u0623\u0648 \u0635\u0643 \u0648\u0631\u062b\u0629 \u0623\u0648 \u0648\u062c\u0648\u062f \u0634\u0631\u064a\u0643 \u0642\u0627\u0635\u0631 \u062a\u0643\u0648\u0646 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0645\u0646 \u062e\u0644\u0627\u0644 \u0645\u064f\u0639\u062a\u0645\u062f \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0623\u0648 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056723"} {"id": "d1ca31729b6e-1", "text": "3\u2022 \u062a\u062d\u062f\u064a\u062f \u062e\u062f\u0645\u0629 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u062f \u0634\u0631\u0643\u0629 . \n 4\u2022 \u062a\u062d\u062f\u064a\u062f \u0633\u0628\u0628 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 . \n 5\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 6\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 8\u2022 \u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 9\u2022 \u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0629 . \n 10\u2022 \u062a\u062d\u062f\u064a\u062b \u0627\u0644\u0648\u062b\u0627\u0626\u0642 (\u0646\u0638\u0627\u0645 \u0627\u0644\u0623\u0633\u0627\u0633/\u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) . \n 11\u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n 12\u2022\u0628\u0625\u0645\u0643\u0627\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u062d\u0627\u0644\u0629 \u0627\u0644\u0637\u0644\u0628 ( \u0628\u0625\u0646\u062a\u0638\u0627\u0631 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a ) \u0628\u0625\u062a\u0628\u0627\u0639 \u0627\u0644\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062a\u0627\u0644\u064a\u0647 : \n 13\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). 
\n 14\u2022\u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0628\u0631 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 15\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0637\u0644\u0628\u0627\u062a\u064a . \n 16\u2022\u0625\u0636\u0627\u0641\u0629 \u0631\u0642\u0645 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u0625\u064a\u0642\u0648\u0646\u0629 (\u0628\u062d\u062b \u0645\u062a\u0642\u062f\u0645 ) \u0641\u064a \u0623\u0639\u0644\u0649 \u0627\u0644\u0635\u0641\u062d\u0629 .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056723"} {"id": "d1ca31729b6e-2", "text": "17\u2022\u0628\u0639\u062f \u0638\u0647\u0648\u0631 \u0627\u0644\u0637\u0644\u0628 \u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0625\u062c\u0631\u0627\u0621\u0627\u062a \u0648\u0627\u062e\u062a\u064a\u0627\u0631 ( \u062a\u0641\u0627\u0635\u064a\u0644 ) . \n 18\u2022\u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0639\u0642\u062f : 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a . \n \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% . \n \u0631\u0633\u0648\u0645 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a: 100 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056723"} {"id": "a087ade56ad1-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u0648\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 \u0644\u0644\u0634\u0631\u0643\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0641\u064a \u062d\u0627\u0644 \u062a\u0639\u062f\u064a\u0644 \u0645\u062c\u0644\u0633 \u0627\u0644\u0645\u062f\u064a\u0631\u064a\u0646 \u0623\u0648 \u0627\u0644\u0625\u062f\u0627\u0631\u0629 \u064a\u062c\u0628 \u0645\u0631\u0627\u0639\u0627\u0629 \u0627\u0644\u0646\u0635\u0627\u0628 \u0627\u0644\u0642\u0627\u0646\u0648\u0646\u064a . \n \u2022\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0628\u062f\u062e\u0648\u0644 \u0634\u0631\u064a\u0643 \u0645\u0647\u0646\u064a \u064a\u062c\u0628 \u0623\u0646 \u064a\u0643\u0648\u0646 \u0645\u0631\u062e\u0651\u064e\u0635. 
\n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0647\u064a\u0626\u0629 \u0627\u0644\u0632\u0643\u0627\u0629 \u0648\u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0648\u0627\u0644\u062c\u0645\u0627\u0631\u0643. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0639\u062f\u0644 . \n \u2022\u0623\u0646 \u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629 . \n \u2022\u0645\u0644\u0627\u062d\u0638\u0629: \n \u2022\u0641\u064a \u062d\u0627\u0644\u0629 \u0625\u0636\u0627\u0641\u0629 / \u062d\u0630\u0641 \u0628\u064a\u0627\u0646\u0627\u062a \u0634\u0631\u064a\u0643 \u0628\u0646\u0627\u0621\u064b \u0639\u0644\u0649 \u062d\u0643\u0645 \u0642\u0636\u0627\u0626\u064a \u0623\u0648 \u0635\u0643 \u0648\u0631\u062b\u0629 \u0623\u0648 \u0648\u062c\u0648\u062f \u0634\u0631\u064a\u0643 \u0642\u0627\u0635\u0631 \u062a\u0643\u0648\u0646 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0645\u0646 \u062e\u0644\u0627\u0644 \u0645\u064f\u0639\u062a\u0645\u062f \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0623\u0648 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672b"} {"id": "a087ade56ad1-1", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u062e\u062f\u0645\u0629 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u062f \u0634\u0631\u0643\u0629 . \n 4\u2022 \u062a\u062d\u062f\u064a\u062f \u0633\u0628\u0628 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 . \n 5\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 6\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 8\u2022 \u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 9\u2022 \u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0629 . \n 10\u2022 \u062a\u062d\u062f\u064a\u062b \u0627\u0644\u0648\u062b\u0627\u0626\u0642 (\u0646\u0638\u0627\u0645 \u0627\u0644\u0623\u0633\u0627\u0633/\u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) . 
\n 11\u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n 12\u2022\u0628\u0625\u0645\u0643\u0627\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u062d\u0627\u0644\u0629 \u0627\u0644\u0637\u0644\u0628 ( \u0628\u0625\u0646\u062a\u0638\u0627\u0631 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a ) \u0625\u062a\u0628\u0627\u0639 \u0627\u0644\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062a\u0627\u0644\u064a\u0647 : \n 13\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC).", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672b"} {"id": "a087ade56ad1-2", "text": "14\u2022\u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0628\u0631 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 15\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0637\u0644\u0628\u0627\u062a\u064a . \n 16\u2022\u0625\u0636\u0627\u0641\u0629 \u0631\u0642\u0645 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u0625\u064a\u0642\u0648\u0646\u0629 (\u0628\u062d\u062b \u0645\u062a\u0642\u062f\u0645 ) \u0641\u064a \u0623\u0639\u0644\u0649 \u0627\u0644\u0635\u0641\u062d\u0629 . \n 17\u2022\u0628\u0639\u062f \u0638\u0647\u0648\u0631 \u0627\u0644\u0637\u0644\u0628 \u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0625\u062c\u0631\u0627\u0621\u0627\u062a \u0648\u0623\u062e\u062a\u064a\u0627\u0631 ( \u062a\u0641\u0627\u0635\u064a\u0644 ) . \n 18\u2022\u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0639\u0642\u062f : 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a\u060c \u064a\u0636\u0627\u0641 \u0639\u0644\u064a\u0647\u0627 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% . 
\n \u0631\u0633\u0648\u0645 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a : 100 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a..\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672b"} {"id": "9a1a18440fc9-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u0648\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 \u0644\u0644\u0634\u0631\u0643\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0647\u064a\u0626\u0629 \u0627\u0644\u0632\u0643\u0627\u0629 \u0648 \u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0648\u0627\u0644\u062c\u0645\u0627\u0631\u0643. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0639\u062f\u0644. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629. \n \u2022\u0641\u064a \u062d\u0627\u0644\u0629 \u0625\u0636\u0627\u0641\u0629 / \u062d\u0630\u0641 \u0628\u064a\u0627\u0646\u0627\u062a \u0634\u0631\u064a\u0643 \u0628\u0646\u0627\u0621\u064b \u0639\u0644\u0649 \u062d\u0643\u0645 \u0642\u0636\u0627\u0626\u064a \u064a\u062a\u0645 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0639\u0628\u0631 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644\u060c \u0648\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0635\u0643 \u0648\u0631\u062b\u0629 \u062a\u0643\u0648\u0646 \u0639\u0628\u0631 \u0645\u064f\u0639\u062a\u0645\u062f \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0623\u0648 \u0643\u0627\u062a\u0628 \u0639\u062f\u0644. \n \u2022\u0641\u064a \u062d\u0627\u0644 \u0625\u0636\u0627\u0641\u0629 \u0634\u0631\u064a\u0643 \u0642\u0627\u0635\u0631 \u064a\u062a\u0645 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0639\u0628\u0631 \u0643\u0627\u062a\u0628 \u0639\u062f\u0644. 
\n \u2022\u0625\u0631\u0641\u0627\u0642 \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0627\u0644\u0625\u0633\u062a\u062b\u0645\u0627\u0631\u064a \u0645\u0639 \u0645\u0631\u0627\u0639\u0627\u0629 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0641\u064a \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0623\u0648\u0644\u0627\u064b.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672c"} {"id": "9a1a18440fc9-1", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0627\u0644\u0643\u062a\u0631\u0648\u0646\u064a \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u062e\u062f\u0645\u0629 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u062f \u0634\u0631\u0643\u0629 . \n 4\u2022 \u062a\u062d\u062f\u064a\u062f \u0633\u0628\u0628 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 . \n 5\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 6\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 8\u2022 \u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 9\u2022 \u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0629 . \n 10\u2022 \u062a\u062d\u062f\u064a\u062b \u0627\u0644\u0648\u062b\u0627\u0626\u0642 (\u0646\u0638\u0627\u0645 \u0627\u0644\u0623\u0633\u0627\u0633/\u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) . \n 11\u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n 12\u2022\u0628\u0625\u0645\u0643\u0627\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u062d\u0627\u0644\u0629 \u0627\u0644\u0637\u0644\u0628 ( \u0628\u0625\u0646\u062a\u0638\u0627\u0631 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a ) \u0628\u0625\u062a\u0628\u0627\u0639 \u0627\u0644\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062a\u0627\u0644\u064a\u0647 : \n 13\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC).", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672c"} {"id": "9a1a18440fc9-2", "text": "14\u2022\u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0628\u0631 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 15\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0637\u0644\u0628\u0627\u062a\u064a . 
\n 16\u2022\u0625\u0636\u0627\u0641\u0629 \u0631\u0642\u0645 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u0625\u064a\u0642\u0648\u0646\u0629 (\u0628\u062d\u062b \u0645\u062a\u0642\u062f\u0645 ) \u0641\u064a \u0623\u0639\u0644\u0649 \u0627\u0644\u0635\u0641\u062d\u0629 . \n 17\u2022\u0628\u0639\u062f \u0638\u0647\u0648\u0631 \u0627\u0644\u0637\u0644\u0628 \u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0625\u062c\u0631\u0627\u0621\u0627\u062a \u0648\u0627\u062e\u062a\u064a\u0627\u0631 ( \u062a\u0641\u0627\u0635\u064a\u0644 ) . \n 18\u2022\u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0639\u0642\u062f : \n 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% . \n \u0631\u0633\u0648\u0645 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a : \n 100 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672c"} {"id": "503dc6645068-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u0627\u0644\u0645\u0647\u0646\u064a\u0629 \u0627\u0644\u0635\u0627\u062f\u0631\u0629 \u0628\u0645\u0648\u062c\u0628 \u062a\u0631\u062e\u064a\u0635 \u0645\u0647\u0646\u064a \u0648 \u0645\u0632\u0627\u0648\u0644\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0628\u0623\u0646\u0648\u0627\u0639\u0647\u0627: (\u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629- \u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u2013 \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629- \u0627\u0644\u0645\u0633\u0627\u0647\u0645\u0629)\u00a0 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u064a\u062c\u0628 \u0623\u0646 \u062a\u062a\u0648\u0641\u0631 \u0631\u062e\u0635\u0629 \u0645\u0647\u0646\u064a\u0629 \u0633\u0627\u0631\u064a\u0629 \u0644\u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0648\u0642\u062a \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . 
\n \u2022\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u063a\u064a\u0631 \u0633\u0639\u0648\u062f\u064a\u061b \u064a\u062c\u0628 \u0625\u0631\u0641\u0627\u0642 \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0627\u0644\u0645\u0647\u0646\u064a \u0627\u0644\u0635\u0627\u062f\u0631 \u0645\u0646 \u062f\u0627\u062e\u0644 \u0627\u0644\u0645\u0645\u0644\u0643\u0629. \n \u2022\u0623\u0646 \u0644\u0627 \u062a\u0642\u0644 \u0646\u0633\u0628\u0629 \u0627\u0644\u0634\u0631\u064a\u0643 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0627\u0644\u0645\u0631\u062e\u0651\u064e\u0635 \u0639\u0646 (25%) \u0645\u0646 \u0631\u0623\u0633 \u0645\u0627\u0644 \u0627\u0644\u0634\u0631\u0643\u0629 \u0627\u0644\u0645\u0647\u0646\u064a\u0629 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u062e\u062a\u0644\u0637\u0629 . \n \u2022\u0623\u0646 \u0644\u0627 \u062a\u0642\u0644 \u0646\u0633\u0628\u0629 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0645\u0631\u062e\u0651\u064e\u0635\u064a\u0646 \u0639\u0646 (70%) \u0644\u0633\u0639\u0648\u062f\u064a \u0648\u0627\u0644\u062e\u0644\u064a\u062c\u064a . \n \u2022\u0648\u062c\u0648\u062f \u062a\u0631\u062e\u064a\u0635 \u0627\u0633\u062a\u062b\u0645\u0627\u0631\u064a .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672d"} {"id": "503dc6645068-1", "text": "\u2022\u0648\u062c\u0648\u062f \u062a\u0631\u062e\u064a\u0635 \u0627\u0633\u062a\u062b\u0645\u0627\u0631\u064a . \n \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0627\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a. \n \u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n \u2022\u0623\u0646\u0648\u0627\u0639 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0641\u064a \u0627\u0644\u0639\u0642\u062f : \n \u2022\u0634\u0631\u064a\u0643 \u0645\u0631\u062e\u0635. \n \u2022\u0634\u0631\u064a\u0643 \u0628\u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . \n \u2022\u0634\u0631\u064a\u0643 \u0628\u0627\u0644\u0639\u0645\u0644.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0625\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 ( \u0645\u0647\u0646\u064a\u0629 ) . \n 4\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0627\u0637\u0631\u0627\u0641 . 
\n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . \n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a\u064b\u0627 .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672d"} {"id": "503dc6645068-2", "text": "\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0627\u0644\u0645\u0647\u0646\u064a\u0629 ( \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629) 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0627\u0644\u0645\u0647\u0646\u064a\u0629 (\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 \u0648\u0627\u0644\u062a\u0636\u0627\u0645\u0646) 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0627\u0644\u0645\u0647\u0646\u064a\u0629 (\u0627\u0644\u0645\u0633\u0627\u0647\u0645\u0629) 1600 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 : 500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u064a\u0636\u0627\u0641 \u0639\u0644\u064a\u0647\u0627 \u0642\u064a\u0645\u0629 \u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=143/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672d"} {"id": "03cfa28e4c27-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u0648\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 \u0644\u0644\u0634\u0631\u0643\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0641\u064a \u062d\u0627\u0644 \u062a\u0639\u062f\u064a\u0644 \u0645\u062c\u0644\u0633 \u0627\u0644\u0645\u062f\u064a\u0631\u064a\u0646 \u0623\u0648 \u0627\u0644\u0625\u062f\u0627\u0631\u0629 \u064a\u062c\u0628 \u0645\u0631\u0627\u0639\u0627\u0629 \u0627\u0644\u0646\u0635\u0627\u0628 \u0627\u0644\u0642\u0627\u0646\u0648\u0646\u064a . \n \u2022\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0628\u062f\u062e\u0648\u0644 \u0634\u0631\u064a\u0643 \u0645\u0647\u0646\u064a \u064a\u062c\u0628 \u0623\u0646 \u064a\u0643\u0648\u0646 \u0645\u0631\u062e\u0651\u064e\u0635 . \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0647\u064a\u0626\u0629 \u0627\u0644\u0632\u0643\u0627\u0629 \u0648 \u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0648 \u0627\u0644\u062c\u0645\u0627\u0631\u0643 . 
\n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0639\u062f\u0644. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629. \n \u2022\u0641\u064a \u062d\u0627\u0644\u0629 \u0625\u0636\u0627\u0641\u0629 / \u062d\u0630\u0641 \u0628\u064a\u0627\u0646\u0627\u062a \u0634\u0631\u064a\u0643 \u0628\u0646\u0627\u0621\u064b \u0639\u0644\u0649 \u062d\u0643\u0645 \u0642\u0636\u0627\u0626\u064a \u064a\u062a\u0645 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0639\u0628\u0631 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644\u060c \u0648\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0635\u0643 \u0648\u0631\u062b\u0629 \u062a\u0643\u0648\u0646 \u0639\u0628\u0631 \u0645\u064f\u0639\u062a\u0645\u062f \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0623\u0648 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644. \n \u2022\u0641\u064a \u062d\u0627\u0644 \u0625\u0636\u0627\u0641\u0629 \u0634\u0631\u064a\u0643 \u0642\u0627\u0635\u0631 \u064a\u062a\u0645 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0639\u0628\u0631 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672e"} {"id": "03cfa28e4c27-1", "text": "\u2022\u0625\u0631\u0641\u0627\u0642 \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0627\u0644\u0625\u0633\u062a\u062b\u0645\u0627\u0631\u064a \u0645\u0639 \u0645\u0631\u0627\u0639\u0627\u0629 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0641\u064a \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0623\u0648\u0644\u0627\u064b.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u062e\u062f\u0645\u0629 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u062f \u0634\u0631\u0643\u0629 . \n 4\u2022 \u062a\u062d\u062f\u064a\u062f \u0633\u0628\u0628 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 . \n 5\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 6\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 8\u2022 \u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 9\u2022 \u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0629 . 
\n 10\u2022 \u062a\u062d\u062f\u064a\u062b \u0627\u0644\u0648\u062b\u0627\u0626\u0642 (\u0646\u0638\u0627\u0645 \u0627\u0644\u0623\u0633\u0627\u0633/\u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) . \n 11\u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n 12\u2022\u0628\u0625\u0645\u0643\u0627\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u062d\u0627\u0644\u0629 \u0627\u0644\u0637\u0644\u0628 ( \u0628\u0625\u0646\u062a\u0638\u0627\u0631 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a ) \u0628\u0625\u062a\u0628\u0627\u0639 \u0627\u0644\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062a\u0627\u0644\u064a\u0647 :", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672e"} {"id": "03cfa28e4c27-2", "text": "13\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 14\u2022\u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0628\u0631 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 15\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0637\u0644\u0628\u0627\u062a\u064a . \n 16\u2022\u0625\u0636\u0627\u0641\u0629 \u0631\u0642\u0645 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u0625\u064a\u0642\u0648\u0646\u0629 (\u0628\u062d\u062b \u0645\u062a\u0642\u062f\u0645 ) \u0641\u064a \u0623\u0639\u0644\u0649 \u0627\u0644\u0635\u0641\u062d\u0629 . \n 17\u2022\u0628\u0639\u062f \u0638\u0647\u0648\u0631 \u0627\u0644\u0637\u0644\u0628 \u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0625\u062c\u0631\u0627\u0621\u0627\u062a \u0648\u0627\u062e\u062a\u064a\u0627\u0631 ( \u062a\u0641\u0627\u0635\u064a\u0644 ) . \n 18\u2022\u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0639\u0642\u062f: \n 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a\u060c \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% . 
\n \u0631\u0633\u0648\u0645 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a: \n 100 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672e"} {"id": "03cfa28e4c27-3", "text": "https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140 \n * \u0639\u0646\u062f \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u0638\u0647\u0631 \u0644\u0644\u0639\u0645\u064a\u0644 \u0646\u0648\u0639 \u0627\u0644\u0643\u064a\u0627\u0646 \u0627\u0644\u0645\u0637\u0644\u0648\u0628 \u0639\u0646\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 ( \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629-\u0627\u0644\u062a\u0636\u0627\u0645\u0646-\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629- \u0627\u0644\u0645\u0647\u0646\u064a\u0629 )/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672e"} {"id": "196a7f6e6203-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u062d\u0648\u064a\u0644 ( \u0646\u0648\u0639 \u0627\u0644\u0645\u0646\u0634\u0623\u0629 ) \u0645\u0646 \u0634\u0631\u0643\u0629 \u0625\u0644\u0649 \u0645\u0624\u0633\u0633\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u062a\u0642\u062f\u064a\u0645 \u0642\u0631\u0627\u0631 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0628\u0627\u0644\u062a\u062d\u0648\u064a\u0644 \u0639\u0646 \u0637\u0631\u064a\u0642 \u0646\u0638\u0627\u0645 \u0642\u0631\u0627\u0631\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0633\u0627\u0628\u0642 \u0642\u064a \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a \u0644\u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629 . \n \u2022\u0623\u0644\u0627 \u064a\u0642\u0644 \u0639\u0645\u0631 \u0645\u0642\u062f\u0645 \u0627\u0644\u0637\u0644\u0628 \u0639\u0646 18 \u0633\u0646\u0629. \n \u2022\u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0645\u0627\u0644\u0643 \u0627\u0644\u0645\u0624\u0633\u0633\u0629 \u0645\u0648\u0638\u0641 \u062d\u0643\u0648\u0645\u064a. \n \u2022\u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0644\u062f\u0649 \u0645\u0627\u0644\u0643 \u0627\u0644\u0645\u0624\u0633\u0633\u0629 \u0633\u062c\u0644 \u062a\u062c\u0627\u0631\u064a \u0642\u0627\u0626\u0645 (\u0645\u0624\u0633\u0633\u0629 \u0642\u0627\u0626\u0645\u0629). \n \u2022\u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0627\u0644\u0645\u0631\u0627\u062f \u062a\u062d\u0648\u064a\u0644\u0647 \u0623\u064a \u0625\u064a\u0642\u0627\u0641. \n \u2022\u0623\u0644\u0627\u064a\u0643\u0648\u0646 \u0627\u0644\u0633\u062c\u0644 \u0642\u0627\u0626\u0645 ( \u063a\u064a\u0631 \u0645\u0646\u062a\u0647\u064a ) . 
\n \u2022\u0639\u062f\u0645 \u0648\u062c\u0648\u062f \u0637\u0644\u0628\u0627\u062a \u0645\u0639\u0644\u0642\u0629 \u0639\u0644\u0649 \u0646\u0641\u0633 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056731"} {"id": "196a7f6e6203-1", "text": "\u2022\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0633\u0627\u0647\u0645\u0629 \u0623\u0648 \u0645\u0633\u0627\u0647\u0645\u0629 \u0645\u0628\u0633\u0637\u0629 \u064a\u062a\u0645 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u062e\u062f\u0645\u0629 \u0639\u0646 \u0637\u0631\u064a\u0642 \u0627\u0644\u0641\u0631\u0639 \u0627\u0644\u0631\u0642\u0645\u064a \u0645\u0639 \u062a\u0636\u0645\u064a\u0646 \u0625\u0631\u0641\u0627\u0642 \u0642\u0631\u0627\u0631 \u0627\u0644\u062c\u0645\u0639\u064a\u0629 \u0628\u0627\u0644\u062a\u062d\u0648\u064a\u0644 \u0628\u0639\u062f \u0646\u0634\u0631\u0647 \u0639\u0644\u0649 \u0646\u0638\u0627\u0645 \u0642\u0631\u0627\u0631 \u0627\u0644\u062c\u0645\u0639\u064a\u0627\u062a \u0627\u0644\u063a\u064a\u0631 \u0639\u0627\u062f\u064a\u0629.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0625\u0644\u0649 \u0646\u0638\u0627\u0645 \u0642\u0631\u0627\u0631\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0633\u0627\u0628\u0642 \u0642\u064a \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a \u0644\u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629\u060c \u0648 \u062a\u0642\u062f\u064a\u0645 \u0637\u0644\u0628 \u062a\u062d\u0648\u064a\u0644\u060c \u062b\u0645 \u062a\u062a\u0645 \u062f\u0631\u0627\u0633\u0629 \u0627\u0644\u0637\u0644\u0628 \u0648 \u0627\u0639\u062a\u0645\u0627\u062f\u0647. \n 2\u2022\u0625\u0635\u062f\u0627\u0631 \u0641\u0627\u062a\u0648\u0631\u0629 \u0633\u062f\u0627\u062f . \n 3\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0631\u0633\u0648\u0645 \u0648 \u062a\u0648\u062b\u064a\u0642 \u0627\u0644\u0637\u0644\u0628 \u0644\u062f\u0649 \u0645\u0648\u0638\u0641 \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0648\u0646\u0634\u0631 \u0627\u0644\u062a\u062d\u0648\u064a\u0644 \u0627\u0644\u0643\u062a\u0631\u0648\u0646\u064a\u064b\u0627. \n 4\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u0641\u0631\u0639 \u0627\u0644\u0631\u0642\u0645\u064a \u0648 \u062a\u0639\u0628\u0626\u0629 \u0646\u0645\u0648\u0630\u062c \u0637\u0644\u0628 \u0625\u0635\u062f\u0627\u0631 \u0633\u062c\u0644 \u062a\u062c\u0627\u0631\u064a. \n 5\u2022\u0625\u0631\u0633\u0627\u0644 \u0627\u0644\u0637\u0644\u0628 \u0648 \u0627\u0639\u062a\u0645\u0627\u062f\u0647. 
\n 6\u2022\u0628\u0639\u062f \u0633\u062f\u0627\u062f \u0627\u0644\u0631\u0633\u0648\u0645 \u064a\u062a\u0645 \u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0633\u062c\u0644.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056731"} {"id": "196a7f6e6203-2", "text": "\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a \u0644\u0644\u062e\u062f\u0645\u0629 : \n \u0642\u0631\u0627\u0631 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 ( \u0644\u062a\u0642\u062f\u064a\u0645 \u0637\u0644\u0628 \u0627\u0644\u062a\u062d\u0648\u064a\u0644 ) \n https://mc.gov.sa/ar/eservices/Pages/ServiceDetails.aspx?sID=74 \n \u0627\u0644\u0641\u0631\u0639 \u0627\u0644\u0631\u0642\u0645\u064a ( \u0637\u0644\u0628 \u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) \n https://mc.gov.sa/ar/eservices/Pages/ServiceDetails.aspx?sID=29/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056731"}
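The card body above is raw JSON Lines. Each record carries an `id`, a `text` field with a Unicode-escaped Arabic description of a Saudi Business Center (SBC) service (description, requirements, steps, and fees; the contract-amendment publication fee of 1,500 SAR plus 15% VAT, for instance, comes to 1,725 SAR, with a further 100 SAR for the commercial-register amendment), and a `source` URL on kgate.bc.gov.sa. Below is a minimal decoding sketch: the record id and source URL are copied verbatim from the rows above, the `text` is a two-word excerpt for brevity, and `json.loads` resolves the `\uXXXX` escapes on its own.

```python
# Minimal sketch: parse one record shaped like the rows above.
# The id and source URL are verbatim from the data; the text is a
# two-word excerpt (the real record's text field is much longer).
import json

raw = ('{"id": "196a7f6e6203-2", '
       '"text": "\\u0631\\u0633\\u0648\\u0645 \\u0627\\u0644\\u062e\\u062f\\u0645\\u0629", '
       '"source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056731"}')
record = json.loads(raw)
print(record["id"])      # -> 196a7f6e6203-2
print(record["text"])    # -> "رسوم الخدمة" (service fees)
print(record["source"])  # -> the originating kgate.bc.gov.sa service page
```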
faranheit/ministries
[ "task_categories:question-answering", "language:ar", "license:apache-2.0", "not-for-all-audiences", "region:us" ]
2024-01-11T11:23:57+00:00
{"language": ["ar"], "license": "apache-2.0", "task_categories": ["question-answering"], "tags": ["not-for-all-audiences"]}
2024-01-11T11:39:25+00:00
[]
[ "ar" ]
TAGS #task_categories-question-answering #language-Arabic #license-apache-2.0 #not-for-all-audiences #region-us
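Per the tags above, this is an Arabic (`ar`), apache-2.0, question-answering dataset. The sketch below loads it with the standard `datasets` API; the repo id comes from this entry, while the `train` split name and the JSON Lines layout are assumptions to verify against the repo's actual files.

```python
# Hedged sketch: load the repo with the stock datasets API. The repo id is
# from this entry; the "train" split and JSONL layout are assumptions.
from datasets import load_dataset

ds = load_dataset("faranheit/ministries", split="train")
print(ds.column_names)      # expected from the records above: id, text, source
print(ds[0]["text"][:80])   # Arabic text, already unescaped by the loader
```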
{"id": "130042945016-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062c\u0645\u064a\u0639 \u0637\u0644\u0628\u0627\u062a \u0639\u0642\u0648\u062f \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u062d\u0633\u0628 \u0627\u0644\u0643\u064a\u0627\u0646 : \n \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u064f\u062a\u0645\u0651\u0643\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u064f\u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a\u060c \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0645\u062d\u062f\u0648\u062f\u0629 \u0645\u0646 \u0634\u062e\u0635 \n \u0648\u0627\u062d\u062f\u060c \u0623\u0648 \u0623\u0643\u062b\u0631 \u0645\u0646 \u0630\u0648\u064a \u0627\u0644\u0635\u0641\u0629 \u0627\u0644\u0637\u0628\u064a\u0639\u064a\u0629 \u0623\u0648 \u0627\u0644\u0625\u0639\u062a\u0628\u0627\u0631\u064a\u0629\u060c \u0648\u062a\u0639\u062f \u0630\u0645\u062a\u0647\u0627 \u0645\u0633\u062a\u0642\u0644\u0629 \u0639\u0646 \u0627\u0644\u0630\u0645\u0629 \u0627\u0644\u0645\u0627\u0644\u064a\u0629 \u0644\u0643\u0644 \u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627. \n \u0648\u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u062d\u062f\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0629 \u0639\u0646 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0645\u062a\u0631\u062a\u0628\u0629 \u0639\u0644\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0646\u0627\u0634\u0626\u0629 \u0639\u0646 \u0646\u0634\u0627\u0637\u0647\u0627\u060c \u0648\u0644\u0627 \u064a\u0643\u0648\u0646 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627 \u0648\u0644\u0627 \n \u0627\u0644\u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0622\u064b \u0639\u0646 \u0647\u0630\u0647 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0625\u0644\u0627 \u0628\u0642\u062f\u0631 \u062d\u0635\u062a\u0647 \u0641\u064a \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . 
\n General partnership (Tadamun):", "source": "URL {"id": "130042945016-1", "text": "General partnership (Tadamun): \n This service enables the client to start practising commercial activity and establish a general partnership in which all partners are individuals, personally and jointly liable with all of their assets for the company's debts and obligations. \n Limited partnership (Tawsiya Basita): \n This service enables the investor to start practising commercial activity and establish a limited partnership made up of two groups: the (general partner), who is liable for the company's debts and obligations, and the (limited partner), who is liable only up to his share of the capital.\nService terms and requirements: • A natural person must be over 18 years old; if the applicant is a minor, a guardianship deed must be attached.", "source": "URL {"id": "130042945016-2", "text": "• The partners must not be government employees. \n • Verification against the special work rule that applies to certain Najran families. \n • If one of the partners is a legal person, verify that its commercial register is not struck off, suspended, or expired. \n • If there is a foreign partner, verify that an investment licence from the Ministry of Investment is in place. \n • If one of the partners is a government body / private institution / charitable association / endowment, a statutory instrument authorising it to establish a company or participate in one is required. \n • Attach the certified valuer's report if the capital is in kind. \n • Required documents: \n • Attach a licence from the Central Bank if the activity requires one.", "source": "URL {"id": "130042945016-3", "text": "Steps to obtain the service: 1• Access the Saudi Business Center (SBC) platform. \n 2• Log in via the unified national single sign-on or e-mail. \n 3• Select the company type and form (limited liability - general partnership - limited partnership). \n 4• Complete the application form requirements. \n 5• Submit the application. \n 6• The application is processed. \n 7• The parties give their approval. \n 8• Pay the invoice. \n 9• The documents are issued and the commercial register and company contract are printed electronically.\nService fees: Limited liability company: SAR 1,200. \n General partnership - limited partnership: SAR 800. \n Publication fees for entities: SAR 500, plus 15% value-added tax.", "source": "URL {"id": "130042945016-4", "text": "Other information: Website: \n Limited liability company: \n URL \n General partnership: \n URL \n Limited partnership: \n URL ", "source": "URL
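The fee schedule in these records is simple arithmetic: a fixed incorporation fee per entity type, a flat publication fee, and 15% VAT, which the text attaches to the publication fee. A minimal sketch of that calculation, assuming the VAT base is only the publication fee as stated, with illustrative names:

```python
# Sketch of the SBC incorporation fee schedule described in the records above.
# Names and the VAT base are assumptions: the text states that 15% VAT is
# added to the SAR 500 publication fee.
INCORPORATION_FEES_SAR = {
    "limited_liability": 1200,
    "general_partnership": 800,
    "limited_partnership": 800,
}
PUBLICATION_FEE_SAR = 500
VAT_RATE = 0.15

def total_incorporation_cost(entity_type: str) -> float:
    """Entity fee + publication fee + VAT on the publication fee, in SAR."""
    return INCORPORATION_FEES_SAR[entity_type] + PUBLICATION_FEE_SAR * (1 + VAT_RATE)

print(total_incorporation_cost("limited_liability"))  # 1200 + 500 * 1.15 = 1775.0
```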
{"id": "952da374a2f2-0", "text": "Service description: This service enables the client to submit all applications for the articles of incorporation of professional companies under the issued professional licence.\nService terms and requirements: • The terms and requirements of the service for commercial companies are verified, in addition to the following: \n • A valid professional licence must be held by one of the partners at the time the application is submitted, and if one of the partners is non-Saudi, a professional licence issued inside the Kingdom must be attached. \n • The licensed Saudi partner's share must not be less than (25%) of the professional company's capital if the company is mixed. \n • The licensed partners' share must not be less than (70%) for Saudis and Gulf nationals. \n • Attach the certified valuer's report if the capital is in kind.", "source": "URL {"id": "952da374a2f2-1", "text": "• Types of partners in the articles of incorporation: \n • Licensed partner. \n • Capital partner. \n • Working partner.\nSteps to obtain the service: 1• Access the Saudi Business Center (SBC) platform. \n 2• Log in via the unified national single sign-on or e-mail. \n 3• Select the company type and form (limited liability - general partnership - limited partnership), professional. \n 4• Complete the application form requirements. \n 5• Submit the application. \n 6• The application is processed. \n 7• The parties give their approval. \n 8• Pay the invoice. \n 9• The documents are issued and the commercial register and company contract are printed electronically.\nService fees: Limited liability company: SAR 1,200.", "source": "URL {"id": "952da374a2f2-2", "text": "General partnership - limited partnership: SAR 800. \n Plus SAR 500 publication fees + 15% value-added tax.\nOther information: Website: \n Establishing a general partnership: \n URL \n Establishing a limited liability company: \n URL \n Establishing a limited partnership: \n URL ", "source": "URL {"id": "fd131aa6e7d9-0", "text": "Service description: This service enables the client to submit all applications for company articles of incorporation by entity type: \n Limited liability company: \n This service enables the investor to start practising commercial activity and establish a limited liability company owned by one or more natural or legal persons, whose estate is independent of the financial estate of each partner in it or of its owner. \n The company alone is liable for the debts and obligations incurred by it or arising from its activity, and neither its owner nor a partner in it is liable for those debts and obligations beyond his share of the capital. \n General partnership (Tadamun):", "source": "URL {"id": "fd131aa6e7d9-1", "text": "General partnership (Tadamun): \n This service enables the client to start practising commercial activity and establish a general partnership in which all partners are individuals, personally and jointly liable with all of their assets for the company's debts and obligations. \n Limited partnership (Tawsiya Basita): \n This service enables the investor to start practising commercial activity and establish a limited partnership made up of two groups: the (general partner), who is liable for the company's debts and obligations, and the (limited partner), who is liable only up to his share of the capital.\nService terms and requirements: • Attach a licence from the Ministry of Investment. \n • Attach the certified valuer's report if the capital is in kind.", "source": "URL {"id": "fd131aa6e7d9-2", "text": "Steps to obtain the service: 1• Access the Saudi Business Center (SBC) platform. \n 2• Log in via the unified national single sign-on or e-mail. \n 3• Select the company type and form (limited liability company - general partnership - limited partnership), foreign. \n 4• Complete the application form requirements. \n 5• Submit the application. \n 6• The application is processed. \n 7• The parties give their approval. \n 8• Pay the invoice. \n 9• The documents are issued and the commercial register and company contract are printed electronically.\nService fees: Limited liability company: SAR 1,200. \n Publication fees: SAR 500, plus 15% value-added tax. \n General partnership and limited partnership: SAR 800.", "source": "URL {"id": "fd131aa6e7d9-3", "text": "Publication fees: SAR 500, plus 15% value-added tax.\nOther information: Website: \n URL \n * On entering the service, the client is shown the entity type required at incorporation: (limited liability - general partnership - limited partnership), foreign", "source": "URL
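Every record in this dump carries the same three fields — id, text, and source — with the service description, requirements, steps, and fees flattened into text, and chunks of one source page sharing an id prefix. A minimal loading sketch, assuming one JSON object per line; the file name is hypothetical:

```python
import json

# Minimal sketch for reading this dump, assuming one JSON object per line.
# "services.jsonl" is a hypothetical file name; each record's real origin is
# its own "source" URL.
def load_records(path: str) -> list[dict]:
    records = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            rec = json.loads(line)
            # Every record carries the same three fields.
            records.append({"id": rec["id"], "text": rec["text"], "source": rec["source"]})
    return records

records = load_records("services.jsonl")
# Chunks of one source page share an id prefix, e.g. "fd131aa6e7d9-0", "-1", ...
print(sorted({r["id"].rsplit("-", 1)[0] for r in records}))
```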
{"id": "3742980811a3-0", "text": "Service description: This service enables the client to submit all applications for company articles of incorporation by entity type: \n Limited liability company: \n This service enables the investor to start practising commercial activity and establish a limited liability company owned by one or more natural or legal persons, whose estate is independent of the financial estate of each partner in it or of its owner. \n The company alone is liable for the debts and obligations incurred by it or arising from its activity, and neither its owner nor a partner in it is liable for those debts and obligations beyond his share of the capital. \n General partnership (Tadamun):", "source": "URL {"id": "3742980811a3-1", "text": "General partnership (Tadamun): \n This service enables the client to start practising commercial activity and establish a general partnership in which all partners are individuals, personally and jointly liable with all of their assets for the company's debts and obligations. \n Limited partnership (Tawsiya Basita): \n This service enables the investor to start practising commercial activity and establish a limited partnership made up of two groups: the (general partner), who is liable for the company's debts and obligations, and the (limited partner), who is liable only up to his share of the capital.\nService terms and requirements: • Attach a licence from the Ministry of Investment. \n • Attach the certified valuer's report if the capital is in kind.", "source": "URL {"id": "3742980811a3-2", "text": "Steps to obtain the service: 1• Access the Saudi Business Center (SBC) platform. \n 2• Log in via the unified national single sign-on or e-mail. \n 3• Select the company type and form (limited liability company - general partnership - limited partnership), mixed. \n 4• Complete the application form requirements. \n 5• Submit the application. \n 6• The application is processed. \n 7• The parties give their approval. \n 8• Pay the invoice. \n 9• The documents are issued and the commercial register and company contract are printed electronically.\nService fees: Limited liability company: SAR 1,200. \n Limited partnership and general partnership: SAR 800.", "source": "URL {"id": "3742980811a3-3", "text": "Publication fees for entities: SAR 500, plus 15% value-added tax.\nOther information: Website: \n URL \n On entering the service, the client is shown the entity type required at incorporation: (limited liability - general partnership - limited partnership), mixed", "source": "URL
{"id": "d1ca31729b6e-0", "text": "Service description: This service enables the client to submit amendments to the company's articles of incorporation.\nService terms and requirements: • The company must not be suspended by the Zakat, Tax and Customs Authority. \n • The company must not be suspended by the Ministry of Justice. \n • The company must not be suspended by the Ministry of Commerce. \n • Note: \n • When partner data is added/removed on the basis of a court ruling, an inheritance deed, or the presence of a minor partner, approval is given through a Ministry-certified agent or a notary public.\nSteps to obtain the service: 1• Access the Saudi Business Center (SBC) platform. \n 2• Log in via the unified national single sign-on or e-mail.", "source": "URL {"id": "d1ca31729b6e-1", "text": "3• Select the company contract amendment service. \n 4• Specify the reason for the amendment. \n 5• Complete the application form requirements. \n 6• Submit the application. \n 7• The application is processed. \n 8• The parties give their approval. \n 9• Pay the invoice. \n 10• The documents are updated (articles of association / commercial register). \n 11• Note: \n 12• The client can cancel the application while its status is (awaiting data verification) by following these steps: \n 13• Access the Saudi Business Center (SBC) platform. \n 14• Log in via the unified national single sign-on or e-mail. \n 15• Open the My Requests icon. \n 16• Enter the request number in the (advanced search) icon at the top of the page.", "source": "URL {"id": "d1ca31729b6e-2", "text": "17• Once the request appears, click the Actions icon and choose (Details). \n 18• Click Cancel request.\nService fees: Publication fees for the contract amendment: SAR 1,500. \n Plus 15% value-added tax. \n Fees for amending the commercial register: SAR 100.\nOther information: Website: \n URL ", "source": "URL
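The amendment pricing in the record above is again a small computation: a publication fee carrying 15% VAT plus a flat commercial-register fee. A minimal sketch under those stated figures (the function name is illustrative, and it assumes VAT applies only to the publication fee, as the text reads):

```python
# Sketch of the contract-amendment cost described above. The text states 15%
# VAT on the SAR 1,500 publication fee and a flat SAR 100 fee for amending
# the commercial register; the function name is illustrative.
AMENDMENT_PUBLICATION_FEE_SAR = 1500
CR_AMENDMENT_FEE_SAR = 100
VAT_RATE = 0.15

def total_amendment_cost() -> float:
    return AMENDMENT_PUBLICATION_FEE_SAR * (1 + VAT_RATE) + CR_AMENDMENT_FEE_SAR

print(total_amendment_cost())  # 1500 * 1.15 + 100 = 1825.0 SAR
```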
{"id": "a087ade56ad1-0", "text": "Service description: This service enables the client to submit amendments to the company's articles of incorporation.\nService terms and requirements: • When amending the board of managers or the management, the legal quorum must be observed. \n • If the amendment admits a professional partner, that partner must be licensed. \n • The company must not be suspended by the Zakat, Tax and Customs Authority. \n • The company must not be suspended by the Ministry of Justice. \n • The company must not be suspended by the Ministry of Commerce. \n • Note: \n • When partner data is added/removed on the basis of a court ruling, an inheritance deed, or the presence of a minor partner, approval is given through a Ministry-certified agent or a notary public.", "source": "URL {"id": "a087ade56ad1-1", "text": "Steps to obtain the service: 1• Access the Saudi Business Center (SBC) platform. \n 2• Log in via the unified national single sign-on or e-mail. \n 3• Select the company contract amendment service. \n 4• Specify the reason for the amendment. \n 5• Complete the application form requirements. \n 6• Submit the application. \n 7• The application is processed. \n 8• The parties give their approval. \n 9• Pay the invoice. \n 10• The documents are updated (articles of association / commercial register). \n 11• Note: \n 12• The client can cancel the application while its status is (awaiting data verification) by following these steps: \n 13• Access the Saudi Business Center (SBC) platform.", "source": "URL {"id": "a087ade56ad1-2", "text": "14• Log in via the unified national single sign-on or e-mail. \n 15• Open the My Requests icon. \n 16• Enter the request number in the (advanced search) icon at the top of the page. \n 17• Once the request appears, click the Actions icon and choose (Details). \n 18• Click Cancel request.\nService fees: Publication fees for the contract amendment: SAR 1,500, plus 15% value-added tax. \n Fees for amending the commercial register: SAR 100.\nOther information: Website: \n URL ", "source": "URL {"id": "9a1a18440fc9-0", "text": "Service description: This service enables the client to submit amendments to the company's articles of incorporation.\nService terms and requirements: • The company must not be suspended by the Zakat, Tax and Customs Authority. \n • The company must not be suspended by the Ministry of Justice. \n • The company must not be suspended by the Ministry of Commerce. \n • When partner data is added/removed on the basis of a court ruling, approval is given via a notary public; in the case of an inheritance deed, via a Ministry-certified agent or a notary public. \n • When a minor partner is added, approval is given via a notary public. \n • Attach the investment licence, making sure the commercial activity is amended in the licence first.", "source": "URL
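The approval channel for partner-data changes branches on the legal basis of the change. A sketch of that routing rule as the record above states it; the enum and function names are invented for illustration:

```python
# Sketch of the approval-routing rule for partner-data amendments, as stated
# in the record above. All names here are invented for illustration only.
from enum import Enum, auto

class Basis(Enum):
    COURT_RULING = auto()
    INHERITANCE_DEED = auto()
    MINOR_PARTNER = auto()

def approval_channels(basis: Basis) -> set[str]:
    if basis is Basis.COURT_RULING:
        return {"notary public"}
    if basis is Basis.INHERITANCE_DEED:
        return {"ministry-certified agent", "notary public"}
    if basis is Basis.MINOR_PARTNER:
        return {"notary public"}
    raise ValueError(basis)

print(approval_channels(Basis.INHERITANCE_DEED))
```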
{"id": "9a1a18440fc9-1", "text": "Steps to obtain the service: 1• Access the Saudi Business Center (SBC) platform. \n 2• Log in via the national single sign-on or e-mail. \n 3• Select the company contract amendment service. \n 4• Specify the reason for the amendment. \n 5• Complete the application form requirements. \n 6• Submit the application. \n 7• The application is processed. \n 8• The parties give their approval. \n 9• Pay the invoice. \n 10• The documents are updated (articles of association / commercial register). \n 11• Note: \n 12• The client can cancel the application while its status is (awaiting data verification) by following these steps: \n 13• Access the Saudi Business Center (SBC) platform.", "source": "URL {"id": "9a1a18440fc9-2", "text": "14• Log in via the unified national single sign-on or e-mail. \n 15• Open the My Requests icon. \n 16• Enter the request number in the (advanced search) icon at the top of the page. \n 17• Once the request appears, click the Actions icon and choose (Details). \n 18• Click Cancel request.\nService fees: Publication fees for the contract amendment: \n SAR 1,500, plus 15% value-added tax. \n Fees for amending the commercial register: \n SAR 100.\nOther information: Website: \n URL ", "source": "URL
{"id": "503dc6645068-0", "text": "Service description: This service enables the client to establish professional companies issued under a professional licence and to practise the commercial activity, in all their forms: (limited liability - general partnership - limited partnership - joint-stock).\nService terms and requirements: • A valid professional licence must be held by one of the partners at the time the application is submitted. \n • If one of the partners is non-Saudi, the professional licence issued inside the Kingdom must be attached. \n • The licensed Saudi partner's share must not be less than (25%) of the professional company's capital if the company is mixed. \n • The licensed partners' share must not be less than (70%) for Saudis and Gulf nationals. \n • An investment licence must be in place.", "source": "URL {"id": "503dc6645068-1", "text": "• An investment licence must be in place. \n • Attach the certified valuer's report if the capital is in kind. \n • Note: \n • Types of partners in the contract: \n • Licensed partner. \n • Capital partner. \n • Working partner.\nSteps to obtain the service: 1• Access the Saudi Business Center (SBC) platform. \n 2• Log in via the unified national single sign-on or e-mail. \n 3• Select the company type and form (professional). \n 4• Complete the application form requirements. \n 5• Submit the application. \n 6• The application is processed. \n 7• The parties give their approval. \n 8• Pay the invoice. \n 9• The documents are issued and the commercial register and company contract are printed electronically.", "source": "URL
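The two percentage floors above — a licensed Saudi partner holding at least 25% of a mixed professional company, and licensed Saudi/GCC partners holding at least 70% overall — are easy to encode as a validation rule. A sketch under those stated thresholds; the field and function names are invented:

```python
# Sketch of the ownership floors for mixed professional companies stated
# above. Field and function names are invented for illustration.
def validate_mixed_professional(partners: list[dict]) -> list[str]:
    """Each partner dict: {'share': float, 'licensed': bool, 'nationality': str}."""
    errors = []
    licensed_saudi = sum(p["share"] for p in partners
                         if p["licensed"] and p["nationality"] == "SA")
    licensed_sa_gcc = sum(p["share"] for p in partners
                          if p["licensed"] and p["nationality"] in {"SA", "GCC"})
    if licensed_saudi < 0.25:
        errors.append("licensed Saudi partner share below 25%")
    if licensed_sa_gcc < 0.70:
        errors.append("licensed Saudi/GCC partners' share below 70%")
    return errors

print(validate_mixed_professional([
    {"share": 0.30, "licensed": True, "nationality": "SA"},
    {"share": 0.45, "licensed": True, "nationality": "GCC"},
    {"share": 0.25, "licensed": False, "nationality": "US"},
]))  # [] -> both floors satisfied
```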
{"id": "503dc6645068-2", "text": "Service fees: Professional (limited liability): SAR 1,200. \n Professional (limited partnership and general partnership): SAR 800. \n Professional (joint-stock): SAR 1,600. \n Publication fees: SAR 500, plus 15% value-added tax.\nOther information: Website: \n URL ", "source": "URL
\n \u2022\u0641\u064a \u062d\u0627\u0644 \u0625\u0636\u0627\u0641\u0629 \u0634\u0631\u064a\u0643 \u0642\u0627\u0635\u0631 \u064a\u062a\u0645 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0639\u0628\u0631 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644.", "source": "URL {"id": "03cfa28e4c27-1", "text": "\u2022\u0625\u0631\u0641\u0627\u0642 \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0627\u0644\u0625\u0633\u062a\u062b\u0645\u0627\u0631\u064a \u0645\u0639 \u0645\u0631\u0627\u0639\u0627\u0629 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0641\u064a \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0623\u0648\u0644\u0627\u064b.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u062e\u062f\u0645\u0629 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u062f \u0634\u0631\u0643\u0629 . \n 4\u2022 \u062a\u062d\u062f\u064a\u062f \u0633\u0628\u0628 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 . \n 5\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 6\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 8\u2022 \u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 9\u2022 \u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0629 . \n 10\u2022 \u062a\u062d\u062f\u064a\u062b \u0627\u0644\u0648\u062b\u0627\u0626\u0642 (\u0646\u0638\u0627\u0645 \u0627\u0644\u0623\u0633\u0627\u0633/\u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) . \n 11\u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n 12\u2022\u0628\u0625\u0645\u0643\u0627\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u062d\u0627\u0644\u0629 \u0627\u0644\u0637\u0644\u0628 ( \u0628\u0625\u0646\u062a\u0638\u0627\u0631 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a ) \u0628\u0625\u062a\u0628\u0627\u0639 \u0627\u0644\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062a\u0627\u0644\u064a\u0647 :", "source": "URL {"id": "03cfa28e4c27-2", "text": "13\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). 
\n 14\u2022\u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0628\u0631 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 15\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0637\u0644\u0628\u0627\u062a\u064a . \n 16\u2022\u0625\u0636\u0627\u0641\u0629 \u0631\u0642\u0645 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u0625\u064a\u0642\u0648\u0646\u0629 (\u0628\u062d\u062b \u0645\u062a\u0642\u062f\u0645 ) \u0641\u064a \u0623\u0639\u0644\u0649 \u0627\u0644\u0635\u0641\u062d\u0629 . \n 17\u2022\u0628\u0639\u062f \u0638\u0647\u0648\u0631 \u0627\u0644\u0637\u0644\u0628 \u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0625\u062c\u0631\u0627\u0621\u0627\u062a \u0648\u0627\u062e\u062a\u064a\u0627\u0631 ( \u062a\u0641\u0627\u0635\u064a\u0644 ) . \n 18\u2022\u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0639\u0642\u062f: \n 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a\u060c \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% . \n \u0631\u0633\u0648\u0645 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a: \n 100 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n URL "source": "URL {"id": "03cfa28e4c27-3", "text": "URL \n * \u0639\u0646\u062f \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u0638\u0647\u0631 \u0644\u0644\u0639\u0645\u064a\u0644 \u0646\u0648\u0639 \u0627\u0644\u0643\u064a\u0627\u0646 \u0627\u0644\u0645\u0637\u0644\u0648\u0628 \u0639\u0646\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 ( \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629-\u0627\u0644\u062a\u0636\u0627\u0645\u0646-\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629- \u0627\u0644\u0645\u0647\u0646\u064a\u0629 )/n", "source": "URL {"id": "196a7f6e6203-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u062d\u0648\u064a\u0644 ( \u0646\u0648\u0639 \u0627\u0644\u0645\u0646\u0634\u0623\u0629 ) \u0645\u0646 \u0634\u0631\u0643\u0629 \u0625\u0644\u0649 \u0645\u0624\u0633\u0633\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u062a\u0642\u062f\u064a\u0645 \u0642\u0631\u0627\u0631 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0628\u0627\u0644\u062a\u062d\u0648\u064a\u0644 \u0639\u0646 \u0637\u0631\u064a\u0642 \u0646\u0638\u0627\u0645 
\u0642\u0631\u0627\u0631\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0633\u0627\u0628\u0642 \u0642\u064a \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a \u0644\u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629 . \n \u2022\u0623\u0644\u0627 \u064a\u0642\u0644 \u0639\u0645\u0631 \u0645\u0642\u062f\u0645 \u0627\u0644\u0637\u0644\u0628 \u0639\u0646 18 \u0633\u0646\u0629. \n \u2022\u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0645\u0627\u0644\u0643 \u0627\u0644\u0645\u0624\u0633\u0633\u0629 \u0645\u0648\u0638\u0641 \u062d\u0643\u0648\u0645\u064a. \n \u2022\u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0644\u062f\u0649 \u0645\u0627\u0644\u0643 \u0627\u0644\u0645\u0624\u0633\u0633\u0629 \u0633\u062c\u0644 \u062a\u062c\u0627\u0631\u064a \u0642\u0627\u0626\u0645 (\u0645\u0624\u0633\u0633\u0629 \u0642\u0627\u0626\u0645\u0629). \n \u2022\u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0627\u0644\u0645\u0631\u0627\u062f \u062a\u062d\u0648\u064a\u0644\u0647 \u0623\u064a \u0625\u064a\u0642\u0627\u0641. \n \u2022\u0623\u0644\u0627\u064a\u0643\u0648\u0646 \u0627\u0644\u0633\u062c\u0644 \u0642\u0627\u0626\u0645 ( \u063a\u064a\u0631 \u0645\u0646\u062a\u0647\u064a ) . \n \u2022\u0639\u062f\u0645 \u0648\u062c\u0648\u062f \u0637\u0644\u0628\u0627\u062a \u0645\u0639\u0644\u0642\u0629 \u0639\u0644\u0649 \u0646\u0641\u0633 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a .", "source": "URL {"id": "196a7f6e6203-1", "text": "\u2022\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0633\u0627\u0647\u0645\u0629 \u0623\u0648 \u0645\u0633\u0627\u0647\u0645\u0629 \u0645\u0628\u0633\u0637\u0629 \u064a\u062a\u0645 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u062e\u062f\u0645\u0629 \u0639\u0646 \u0637\u0631\u064a\u0642 \u0627\u0644\u0641\u0631\u0639 \u0627\u0644\u0631\u0642\u0645\u064a \u0645\u0639 \u062a\u0636\u0645\u064a\u0646 \u0625\u0631\u0641\u0627\u0642 \u0642\u0631\u0627\u0631 \u0627\u0644\u062c\u0645\u0639\u064a\u0629 \u0628\u0627\u0644\u062a\u062d\u0648\u064a\u0644 \u0628\u0639\u062f \u0646\u0634\u0631\u0647 \u0639\u0644\u0649 \u0646\u0638\u0627\u0645 \u0642\u0631\u0627\u0631 \u0627\u0644\u062c\u0645\u0639\u064a\u0627\u062a \u0627\u0644\u063a\u064a\u0631 \u0639\u0627\u062f\u064a\u0629.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0625\u0644\u0649 \u0646\u0638\u0627\u0645 \u0642\u0631\u0627\u0631\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0633\u0627\u0628\u0642 \u0642\u064a \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a \u0644\u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629\u060c \u0648 \u062a\u0642\u062f\u064a\u0645 \u0637\u0644\u0628 \u062a\u062d\u0648\u064a\u0644\u060c \u062b\u0645 \u062a\u062a\u0645 \u062f\u0631\u0627\u0633\u0629 \u0627\u0644\u0637\u0644\u0628 \u0648 \u0627\u0639\u062a\u0645\u0627\u062f\u0647. \n 2\u2022\u0625\u0635\u062f\u0627\u0631 \u0641\u0627\u062a\u0648\u0631\u0629 \u0633\u062f\u0627\u062f . 
\n 3\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0631\u0633\u0648\u0645 \u0648 \u062a\u0648\u062b\u064a\u0642 \u0627\u0644\u0637\u0644\u0628 \u0644\u062f\u0649 \u0645\u0648\u0638\u0641 \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0648\u0646\u0634\u0631 \u0627\u0644\u062a\u062d\u0648\u064a\u0644 \u0627\u0644\u0643\u062a\u0631\u0648\u0646\u064a\u064b\u0627. \n 4\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u0641\u0631\u0639 \u0627\u0644\u0631\u0642\u0645\u064a \u0648 \u062a\u0639\u0628\u0626\u0629 \u0646\u0645\u0648\u0630\u062c \u0637\u0644\u0628 \u0625\u0635\u062f\u0627\u0631 \u0633\u062c\u0644 \u062a\u062c\u0627\u0631\u064a. \n 5\u2022\u0625\u0631\u0633\u0627\u0644 \u0627\u0644\u0637\u0644\u0628 \u0648 \u0627\u0639\u062a\u0645\u0627\u062f\u0647. \n 6\u2022\u0628\u0639\u062f \u0633\u062f\u0627\u062f \u0627\u0644\u0631\u0633\u0648\u0645 \u064a\u062a\u0645 \u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0633\u062c\u0644.", "source": "URL {"id": "196a7f6e6203-2", "text": "\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a \u0644\u0644\u062e\u062f\u0645\u0629 : \n \u0642\u0631\u0627\u0631 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 ( \u0644\u062a\u0642\u062f\u064a\u0645 \u0637\u0644\u0628 \u0627\u0644\u062a\u062d\u0648\u064a\u0644 ) \n URL \n \u0627\u0644\u0641\u0631\u0639 \u0627\u0644\u0631\u0642\u0645\u064a ( \u0637\u0644\u0628 \u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) \n URL "source": "URL
[]
[ "TAGS\n#task_categories-question-answering #language-Arabic #license-apache-2.0 #not-for-all-audiences #region-us \n" ]
f7bf88f1bd6676dbe7a2c504527df59cbc779182
# Dataset of paprika/パプリカ/明椒 (Arknights) This is the dataset of paprika/パプリカ/明椒 (Arknights), containing 18 images and their tags. The core tags of this character are `long_hair, horns, bangs, black_hair, brown_hair, brown_eyes, multicolored_hair, red_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 18 | 30.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/paprika_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 18 | 13.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/paprika_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 45 | 31.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/paprika_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 18 | 24.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/paprika_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 45 | 50.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/paprika_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/paprika_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------| | 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, long_sleeves, white_background, open_mouth, shirt, simple_background, smile, holding, jacket | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | long_sleeves | white_background | open_mouth | shirt | simple_background | smile | holding | jacket | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:-------------------|:-------------|:--------|:--------------------|:--------|:----------|:---------| | 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X |
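If you only need one of the packaged splits listed above rather than the raw archive, it can be fetched the same way. A minimal sketch for the 800px IMG+TXT package; the local directory name is just an illustration:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/paprika_arknights',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the image/caption pairs into a local directory
dataset_dir = 'paprika_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```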
CyberHarem/paprika_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T11:36:18+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T11:41:19+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of paprika/パプリカ/明椒 (Arknights) ====================================== This is the dataset of paprika/パプリカ/明椒 (Arknights), containing 18 images and their tags. The core tags of this character are 'long\_hair, horns, bangs, black\_hair, brown\_hair, brown\_eyes, multicolored\_hair, red\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
43ef75c5f2bbe16ee02a32a03ce2268981f7c73b
# Dataset of kirin_r_yato/キリンRヤトウ/麒麟R夜刀 (Arknights) This is the dataset of kirin_r_yato/キリンRヤトウ/麒麟R夜刀 (Arknights), containing 81 images and their tags. The core tags of this character are `long_hair, breasts, horns, blue_eyes, bangs, brown_hair, multicolored_hair, white_hair, pointy_ears, large_breasts, streaked_hair, hair_between_eyes, hairband, very_long_hair, medium_breasts, mole, mole_under_eye`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 81 | 186.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirin_r_yato_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 81 | 85.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirin_r_yato_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 217 | 189.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirin_r_yato_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 81 | 151.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirin_r_yato_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 217 | 304.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirin_r_yato_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/kirin_r_yato_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 28 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, midriff, navel, solo, stomach, cleavage, belt, looking_at_viewer, necklace, black_gloves, cowboy_shot, fur_trim, white_background, simple_background, garter_straps, thighhighs, holding, crop_top, detached_sleeves, standing, closed_mouth, single_horn, collarbone, skirt, bikini, smile | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | midriff | navel | solo | stomach | cleavage | belt | looking_at_viewer | necklace | black_gloves | cowboy_shot | fur_trim | white_background | simple_background | garter_straps | thighhighs | holding | crop_top | detached_sleeves | standing | closed_mouth | single_horn | collarbone | skirt | bikini | smile | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:----------|:--------|:-------|:----------|:-----------|:-------|:--------------------|:-----------|:---------------|:--------------|:-----------|:-------------------|:--------------------|:----------------|:-------------|:----------|:-----------|:-------------------|:-----------|:---------------|:--------------|:-------------|:--------|:---------|:--------| | 0 | 28 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
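Once the raw archive is loaded with `LocalSource` as shown above, the per-image tags can be used for simple filtering. A rough sketch, assuming `item.meta['tags']` behaves like a mapping (or collection) of tag names, as the loading snippet above suggests:

```python
from waifuc.source import LocalSource

# 'dataset_dir' is the directory the raw archive was extracted into
source = LocalSource('dataset_dir')

# keep only images tagged 'solo' (tag names follow the cluster tables above)
solo_items = [
    item for item in source
    if 'solo' in (item.meta.get('tags') or {})
]
print(f'{len(solo_items)} solo images found')
```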
CyberHarem/kirin_r_yato_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T11:36:22+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T11:57:20+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of kirin\_r\_yato/キリンRヤトウ/麒麟R夜刀 (Arknights) =================================================== This is the dataset of kirin\_r\_yato/キリンRヤトウ/麒麟R夜刀 (Arknights), containing 81 images and their tags. The core tags of this character are 'long\_hair, breasts, horns, blue\_eyes, bangs, brown\_hair, multicolored\_hair, white\_hair, pointy\_ears, large\_breasts, streaked\_hair, hair\_between\_eyes, hairband, very\_long\_hair, medium\_breasts, mole, mole\_under\_eye', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
9f51b4a23291691d1d2c7de36f4f20107a6c9f0e
# Dataset of proviso/プロヴァイゾ/但书 (Arknights) This is the dataset of proviso/プロヴァイゾ/但书 (Arknights), containing 13 images and their tags. The core tags of this character are `animal_ears, blue_eyes, short_hair, glasses, animal_ear_fluff, bangs, breasts, green_hair, tail, black-framed_eyewear, large_breasts, multicolored_hair, ahoge, grey_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 13 | 19.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/proviso_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 13 | 9.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/proviso_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 32 | 21.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/proviso_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 13 | 16.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/proviso_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 32 | 32.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/proviso_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/proviso_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, long_sleeves, open_jacket, id_card, lanyard, white_jacket, white_shirt, simple_background, belt, black_skirt, pantyhose, white_gloves, collarbone, holding, smile | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | long_sleeves | open_jacket | id_card | lanyard | white_jacket | white_shirt | simple_background | belt | black_skirt | pantyhose | white_gloves | collarbone | holding | smile | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------------|:--------------|:----------|:----------|:---------------|:--------------|:--------------------|:-------|:--------------|:------------|:---------------|:-------------|:----------|:--------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
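The IMG+TXT packages pair each image with a same-named `.txt` caption file. A small standard-library sketch for walking an extracted package; the image extensions checked here are an assumption, not a guarantee of what the archive contains:

```python
from pathlib import Path

# directory an IMG+TXT package (e.g. dataset-800.zip) was extracted into
dataset_dir = Path('dataset_dir')

for txt in sorted(dataset_dir.glob('*.txt')):
    caption = txt.read_text(encoding='utf-8').strip()
    # try common image extensions next to the caption file (assumed, not verified)
    for ext in ('.png', '.jpg', '.webp'):
        image = txt.with_suffix(ext)
        if image.exists():
            print(image.name, '->', caption)
            break
```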
CyberHarem/proviso_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T11:36:25+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T11:41:23+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of proviso/プロヴァイゾ/但书 (Arknights) ======================================== This is the dataset of proviso/プロヴァイゾ/但书 (Arknights), containing 13 images and their tags. The core tags of this character are 'animal\_ears, blue\_eyes, short\_hair, glasses, animal\_ear\_fluff, bangs, breasts, green\_hair, tail, black-framed\_eyewear, large\_breasts, multicolored\_hair, ahoge, grey\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
650cfc8d30e1adcfadbbadf8d10c3dc91c8a7a4f
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-c <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-7B-c](https://huggingface.co/decruz07/kellemar-DPO-7B-c) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-c", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T11:35:01.176663](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-c/blob/main/results_2024-01-11T11-35-01.176663.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6384818948169925, "acc_stderr": 0.032244773650019175, "acc_norm": 0.6408980660124164, "acc_norm_stderr": 0.03288578301996733, "mc1": 0.3733170134638923, "mc1_stderr": 0.016932370557570634, "mc2": 0.5407842160887599, "mc2_stderr": 0.015285349287061248 }, "harness|arc:challenge|25": { "acc": 0.6168941979522184, "acc_stderr": 0.014206472661672876, "acc_norm": 0.6569965870307167, "acc_norm_stderr": 0.013872423223718164 }, "harness|hellaswag|10": { "acc": 0.6591316470822546, "acc_stderr": 0.00473032455662413, "acc_norm": 0.8498307110137423, "acc_norm_stderr": 0.0035650718701954478 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322666, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322666 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.04784060704105653, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105653 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.02544636563440678, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.02544636563440678 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268552, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268552 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548302, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548302 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229862, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229862 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.02247325333276877, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.02247325333276877 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.617948717948718, "acc_stderr": 0.024635549163908237, "acc_norm": 0.617948717948718, "acc_norm_stderr": 0.024635549163908237 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.02822644674968352, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.02822644674968352 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 
0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8330275229357799, "acc_stderr": 0.01599015488507338, "acc_norm": 0.8330275229357799, "acc_norm_stderr": 0.01599015488507338 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.02812597226565437, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.02812597226565437 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.030500283176545847, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.030500283176545847 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.039578354719809805, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.039578354719809805 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371803, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371803 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.02418242749657761, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.02418242749657761 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.31620111731843575, "acc_stderr": 0.015551673652172552, "acc_norm": 0.31620111731843575, "acc_norm_stderr": 0.015551673652172552 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7549019607843137, "acc_stderr": 0.024630048979824782, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.024630048979824782 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464485, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464485 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.02399350170904211, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.02399350170904211 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4667535853976532, "acc_stderr": 0.01274197433389723, "acc_norm": 0.4667535853976532, "acc_norm_stderr": 0.01274197433389723 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.028418208619406762, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.028418208619406762 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786845, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786845 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.3733170134638923, "mc1_stderr": 0.016932370557570634, "mc2": 0.5407842160887599, "mc2_stderr": 0.015285349287061248 }, "harness|winogrande|5": { "acc": 0.7829518547750592, "acc_stderr": 0.01158587171020941 }, "harness|gsm8k|5": { "acc": 0.5822592873388931, "acc_stderr": 0.013584820638504823 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
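As a hedged sketch of working with the results file linked under "Latest results": the per-task metrics can be pulled directly from the JSON in the repository. The filename below is taken from the link above; depending on how the file is laid out, the metrics may sit at the top level as shown in the snippet or under a `results` key, so both cases are handled:

```python
import json

from huggingface_hub import hf_hub_download

# fetch the results file referenced in the "Latest results" section
path = hf_hub_download(
    repo_id='open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-c',
    repo_type='dataset',
    filename='results_2024-01-11T11-35-01.176663.json',
)

with open(path) as f:
    data = json.load(f)

# metrics may be nested under a 'results' key in some dumps
results = data.get('results', data)
for task, metrics in results.items():
    if isinstance(metrics, dict) and 'acc' in metrics:
        print(f"{task}: acc={metrics['acc']:.4f}")
```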
open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-c
[ "region:us" ]
2024-01-11T11:37:18+00:00
{"pretty_name": "Evaluation run of decruz07/kellemar-DPO-7B-c", "dataset_summary": "Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-7B-c](https://huggingface.co/decruz07/kellemar-DPO-7B-c) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-c\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T11:35:01.176663](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-c/blob/main/results_2024-01-11T11-35-01.176663.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6384818948169925,\n \"acc_stderr\": 0.032244773650019175,\n \"acc_norm\": 0.6408980660124164,\n \"acc_norm_stderr\": 0.03288578301996733,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5407842160887599,\n \"mc2_stderr\": 0.015285349287061248\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6168941979522184,\n \"acc_stderr\": 0.014206472661672876,\n \"acc_norm\": 0.6569965870307167,\n \"acc_norm_stderr\": 0.013872423223718164\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6591316470822546,\n \"acc_stderr\": 0.00473032455662413,\n \"acc_norm\": 0.8498307110137423,\n \"acc_norm_stderr\": 0.0035650718701954478\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548302,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548302\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.617948717948718,\n \"acc_stderr\": 0.024635549163908237,\n \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908237\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 
0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n \"acc_stderr\": 0.015551673652172552,\n \"acc_norm\": 0.31620111731843575,\n \"acc_norm_stderr\": 0.015551673652172552\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406762,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406762\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5407842160887599,\n \"mc2_stderr\": 0.015285349287061248\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.01158587171020941\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5822592873388931,\n \"acc_stderr\": 0.013584820638504823\n }\n}\n```", "repo_url": 
"https://huggingface.co/decruz07/kellemar-DPO-7B-c", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|arc:challenge|25_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|gsm8k|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hellaswag|10_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-35-01.176663.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-35-01.176663.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-35-01.176663.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T11-35-01.176663.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-35-01.176663.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T11_35_01.176663", "path": ["**/details_harness|winogrande|5_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T11-35-01.176663.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T11_35_01.176663", "path": ["results_2024-01-11T11-35-01.176663.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T11-35-01.176663.parquet"]}]}]}
2024-01-11T11:37:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-c Dataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B-c on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T11:35:01.176663 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
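The processed card text above refers to a loading snippet without showing it; this record's metadata preserves the call, so a minimal sketch follows. `harness_winogrande_5` is one of the 63 configs listed in the metadata, and any other config name can be substituted the same way:

```python
from datasets import load_dataset

# Per-example details for one evaluated task; the "train" split always
# points at the latest timestamped run.
data = load_dataset(
    "open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-c",
    "harness_winogrande_5",
    split="train",
)
```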
[ "# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-c\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B-c on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T11:35:01.176663(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-c\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B-c on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T11:35:01.176663(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
aebc4016262af6896ec57d7282f28c72fee0e3a5
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-d <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-7B-d](https://huggingface.co/decruz07/kellemar-DPO-7B-d) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-d", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T11:40:20.341969](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-d/blob/main/results_2024-01-11T11-40-20.341969.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6308988893048868, "acc_stderr": 0.032276070451063066, "acc_norm": 0.6321793841494536, "acc_norm_stderr": 0.03292460282734645, "mc1": 0.39657282741738065, "mc1_stderr": 0.017124930942023518, "mc2": 0.5687577354154723, "mc2_stderr": 0.015420590532587394 }, "harness|arc:challenge|25": { "acc": 0.6271331058020477, "acc_stderr": 0.014131176760131169, "acc_norm": 0.6689419795221843, "acc_norm_stderr": 0.013752062419817832 }, "harness|hellaswag|10": { "acc": 0.666301533559052, "acc_stderr": 0.0047056977452221566, "acc_norm": 0.8516231826329417, "acc_norm_stderr": 0.0035474663103254042 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6716981132075471, "acc_stderr": 0.028901593612411784, "acc_norm": 0.6716981132075471, "acc_norm_stderr": 0.028901593612411784 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7361111111111112, "acc_stderr": 0.03685651095897532, "acc_norm": 0.7361111111111112, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6011560693641619, "acc_stderr": 0.037336266553835096, "acc_norm": 0.6011560693641619, "acc_norm_stderr": 0.037336266553835096 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3235294117647059, "acc_stderr": 0.046550104113196177, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.046550104113196177 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.0470070803355104, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.0470070803355104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728762, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728762 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778405, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778405 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181012, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181012 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.031584153240477114, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.031584153240477114 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494562, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494562 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919436, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919436 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6102564102564103, "acc_stderr": 0.024726967886647074, "acc_norm": 0.6102564102564103, "acc_norm_stderr": 0.024726967886647074 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114996, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114996 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.015919557829976037, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.015919557829976037 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389094, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389094 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229136, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229136 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243838, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243838 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.02280138253459754, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.02280138253459754 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368983, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368983 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3195530726256983, "acc_stderr": 0.015595520294147394, "acc_norm": 0.3195530726256983, "acc_norm_stderr": 0.015595520294147394 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.02505850331695814, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.02505850331695814 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.02608270069539967, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.02608270069539967 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.024288533637726095, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.024288533637726095 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.5141843971631206, "acc_stderr": 0.02981549448368206, "acc_norm": 0.5141843971631206, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657476, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657476 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8208955223880597, "acc_stderr": 0.027113286753111837, "acc_norm": 0.8208955223880597, "acc_norm_stderr": 0.027113286753111837 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685517, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.39657282741738065, "mc1_stderr": 0.017124930942023518, "mc2": 0.5687577354154723, "mc2_stderr": 0.015420590532587394 }, "harness|winogrande|5": { "acc": 0.7932123125493291, "acc_stderr": 0.011382566829235798 }, "harness|gsm8k|5": { "acc": 0.620166793025019, "acc_stderr": 0.013368818096960496 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-d
[ "region:us" ]
2024-01-11T11:42:39+00:00
{"pretty_name": "Evaluation run of decruz07/kellemar-DPO-7B-d", "dataset_summary": "Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-7B-d](https://huggingface.co/decruz07/kellemar-DPO-7B-d) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-d\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T11:40:20.341969](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-7B-d/blob/main/results_2024-01-11T11-40-20.341969.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6308988893048868,\n \"acc_stderr\": 0.032276070451063066,\n \"acc_norm\": 0.6321793841494536,\n \"acc_norm_stderr\": 0.03292460282734645,\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5687577354154723,\n \"mc2_stderr\": 0.015420590532587394\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131169,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817832\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.666301533559052,\n \"acc_stderr\": 0.0047056977452221566,\n \"acc_norm\": 0.8516231826329417,\n \"acc_norm_stderr\": 0.0035474663103254042\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.028901593612411784,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.028901593612411784\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 
0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114996,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114996\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 
0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3195530726256983,\n \"acc_stderr\": 0.015595520294147394,\n \"acc_norm\": 0.3195530726256983,\n \"acc_norm_stderr\": 0.015595520294147394\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539967,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5687577354154723,\n \"mc2_stderr\": 0.015420590532587394\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235798\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.620166793025019,\n \"acc_stderr\": 0.013368818096960496\n }\n}\n```", "repo_url": 
"https://huggingface.co/decruz07/kellemar-DPO-7B-d", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|arc:challenge|25_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|gsm8k|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hellaswag|10_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-40-20.341969.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-40-20.341969.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-40-20.341969.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T11-40-20.341969.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-40-20.341969.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T11_40_20.341969", "path": ["**/details_harness|winogrande|5_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T11-40-20.341969.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T11_40_20.341969", "path": ["results_2024-01-11T11-40-20.341969.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T11-40-20.341969.parquet"]}]}]}
2024-01-11T11:43:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-d Dataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B-d on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T11:40:20.341969 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-d\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B-d on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T11:40:20.341969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of decruz07/kellemar-DPO-7B-d\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-7B-d on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T11:40:20.341969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
dd9f18f4616206332217314161740ad7f5b4ac7b
# Dataset Card for "pubmed-2048" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
anumafzal94/pubmed-2048
[ "region:us" ]
2024-01-11T11:43:59+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 126100060, "num_examples": 6592}, {"name": "train", "num_bytes": 96394526.44590327, "num_examples": 5000}, {"name": "validation", "num_bytes": 19355274.16145754, "num_examples": 1005}], "download_size": 78306070, "dataset_size": 241849860.6073608}}
2024-01-11T11:44:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for "pubmed-2048" More Information needed
[ "# Dataset Card for \"pubmed-2048\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"pubmed-2048\"\n\nMore Information needed" ]
1cc059749384d01a94793a681fa2d28de56b1add
# Dataset Card for Evaluation run of proto-llm/uniwiz-7B-v0.2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [proto-llm/uniwiz-7B-v0.2](https://huggingface.co/proto-llm/uniwiz-7B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T11:59:37.867165](https://huggingface.co/datasets/open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2/blob/main/results_2024-01-11T11-59-37.867165.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6349386109714982, "acc_stderr": 0.03234622354979693, "acc_norm": 0.6405443013835485, "acc_norm_stderr": 0.032995556354475986, "mc1": 0.42105263157894735, "mc1_stderr": 0.017283936248136487, "mc2": 0.5991296817497732, "mc2_stderr": 0.01542474423315055 }, "harness|arc:challenge|25": { "acc": 0.6006825938566553, "acc_stderr": 0.014312094557946705, "acc_norm": 0.6331058020477816, "acc_norm_stderr": 0.014084133118104296 }, "harness|hellaswag|10": { "acc": 0.6687910774746066, "acc_stderr": 0.0046968616254969234, "acc_norm": 0.8507269468233419, "acc_norm_stderr": 0.003556291232050351 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6513157894736842, "acc_stderr": 0.038781398887976104, "acc_norm": 0.6513157894736842, "acc_norm_stderr": 0.038781398887976104 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7152777777777778, "acc_stderr": 0.037738099906869334, "acc_norm": 0.7152777777777778, "acc_norm_stderr": 0.037738099906869334 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146267, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146267 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370333, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.0248708152510571, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.0248708152510571 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.04343525428949098, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.04343525428949098 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7548387096774194, "acc_stderr": 0.024472243840895525, "acc_norm": 0.7548387096774194, "acc_norm_stderr": 0.024472243840895525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5221674876847291, "acc_stderr": 0.03514528562175007, "acc_norm": 0.5221674876847291, "acc_norm_stderr": 0.03514528562175007 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.04793724854411019, "acc_norm": 0.65, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.03008862949021749, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.03008862949021749 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.02578772318072387, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.02578772318072387 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6512820512820513, "acc_stderr": 0.02416278028401772, "acc_norm": 0.6512820512820513, "acc_norm_stderr": 0.02416278028401772 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114986, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114986 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6218487394957983, "acc_stderr": 0.03149930577784906, "acc_norm": 0.6218487394957983, "acc_norm_stderr": 0.03149930577784906 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, 
"acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8110091743119267, "acc_stderr": 0.01678548115920363, "acc_norm": 0.8110091743119267, "acc_norm_stderr": 0.01678548115920363 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.034063153607115086, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.034063153607115086 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7745098039215687, "acc_stderr": 0.02933116229425174, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.02933116229425174 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7426160337552743, "acc_stderr": 0.0284588209914603, "acc_norm": 0.7426160337552743, "acc_norm_stderr": 0.0284588209914603 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.031381476375754995, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.031381476375754995 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.0364129708131373, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.0364129708131373 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098825, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098825 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.803680981595092, "acc_stderr": 0.031207970394709225, "acc_norm": 0.803680981595092, "acc_norm_stderr": 0.031207970394709225 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5535714285714286, "acc_stderr": 0.04718471485219587, "acc_norm": 0.5535714285714286, "acc_norm_stderr": 0.04718471485219587 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406943, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406943 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8071519795657727, "acc_stderr": 0.014108533515757433, "acc_norm": 0.8071519795657727, "acc_norm_stderr": 0.014108533515757433 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.023948512905468348, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.023948512905468348 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39888268156424583, "acc_stderr": 0.016376966142610073, "acc_norm": 0.39888268156424583, "acc_norm_stderr": 0.016376966142610073 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7483660130718954, "acc_stderr": 0.0248480182638752, "acc_norm": 0.7483660130718954, "acc_norm_stderr": 0.0248480182638752 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818767, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818767 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7345679012345679, "acc_stderr": 0.02456922360046085, "acc_norm": 0.7345679012345679, "acc_norm_stderr": 0.02456922360046085 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.5, "acc_stderr": 0.029827499313594685, "acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4426336375488918, "acc_stderr": 0.01268590653820624, "acc_norm": 0.4426336375488918, "acc_norm_stderr": 0.01268590653820624 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6654411764705882, "acc_stderr": 0.028661996202335303, "acc_norm": 0.6654411764705882, "acc_norm_stderr": 0.028661996202335303 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.024845753212306053, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.024845753212306053 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.038695433234721015, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.038695433234721015 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.42105263157894735, "mc1_stderr": 0.017283936248136487, "mc2": 0.5991296817497732, "mc2_stderr": 0.01542474423315055 }, "harness|winogrande|5": { "acc": 0.7782162588792423, "acc_stderr": 0.011676109244497813 }, "harness|gsm8k|5": { "acc": 0.3752843062926459, "acc_stderr": 0.01333717054574293 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
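As a complement to the load example above, here is a minimal sketch of how the configurations fit together. It assumes only the public `datasets` API and the config/split names listed in this card's metadata ("results" for the aggregates, "latest" for the newest run); the per-row schema of the detail files is not documented in the card, so the sketch simply prints the first record of each.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2"

# Aggregated metrics of the run; the "latest" split always tracks the newest
# results, while a timestamped split such as "2024_01_11T11_59_37.867165"
# pins one specific evaluation run.
results = load_dataset(REPO, "results", split="latest")

# Per-sample details for a single task, here the 5-shot GSM8K harness.
gsm8k = load_dataset(REPO, "harness_gsm8k_5", split="latest")

print(results[0])  # aggregated metrics record (exact columns not documented here)
print(gsm8k[0])    # first evaluated sample of the task
```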
open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2
[ "region:us" ]
2024-01-11T12:01:57+00:00
{"pretty_name": "Evaluation run of proto-llm/uniwiz-7B-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [proto-llm/uniwiz-7B-v0.2](https://huggingface.co/proto-llm/uniwiz-7B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T11:59:37.867165](https://huggingface.co/datasets/open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2/blob/main/results_2024-01-11T11-59-37.867165.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6349386109714982,\n \"acc_stderr\": 0.03234622354979693,\n \"acc_norm\": 0.6405443013835485,\n \"acc_norm_stderr\": 0.032995556354475986,\n \"mc1\": 0.42105263157894735,\n \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5991296817497732,\n \"mc2_stderr\": 0.01542474423315055\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946705,\n \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104296\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6687910774746066,\n \"acc_stderr\": 0.0046968616254969234,\n \"acc_norm\": 0.8507269468233419,\n \"acc_norm_stderr\": 0.003556291232050351\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.0248708152510571,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.0248708152510571\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8110091743119267,\n \"acc_stderr\": 0.01678548115920363,\n \"acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.01678548115920363\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.0284588209914603,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.0284588209914603\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709225,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709225\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406943,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406943\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n \"acc_stderr\": 0.014108533515757433,\n 
\"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757433\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n \"acc_stderr\": 0.016376966142610073,\n \"acc_norm\": 0.39888268156424583,\n \"acc_norm_stderr\": 0.016376966142610073\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.02456922360046085,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.02456922360046085\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.01268590653820624,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.01268590653820624\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42105263157894735,\n \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5991296817497732,\n \"mc2_stderr\": 0.01542474423315055\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3752843062926459,\n \"acc_stderr\": 0.01333717054574293\n }\n}\n```", "repo_url": "https://huggingface.co/proto-llm/uniwiz-7B-v0.2", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|arc:challenge|25_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|gsm8k|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hellaswag|10_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-59-37.867165.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-59-37.867165.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-59-37.867165.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T11-59-37.867165.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-59-37.867165.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-59-37.867165.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["**/details_harness|winogrande|5_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T11-59-37.867165.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T11_59_37.867165", "path": ["results_2024-01-11T11-59-37.867165.parquet"]}, {"split": "latest", "path": 
["results_2024-01-11T11-59-37.867165.parquet"]}]}]}
2024-01-11T12:02:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of proto-llm/uniwiz-7B-v0.2 Dataset automatically created during the evaluation run of model proto-llm/uniwiz-7B-v0.2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T11:59:37.867165 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of proto-llm/uniwiz-7B-v0.2\n\n\n\nDataset automatically created during the evaluation run of model proto-llm/uniwiz-7B-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T11:59:37.867165(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of proto-llm/uniwiz-7B-v0.2\n\n\n\nDataset automatically created during the evaluation run of model proto-llm/uniwiz-7B-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T11:59:37.867165(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
57ccb5939c7a3cadd623ab3ffa910264b04f9d1b
# Dataset Card for Evaluation run of tuantran1632001/Psyfighter2-Orca2-ties <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [tuantran1632001/Psyfighter2-Orca2-ties](https://huggingface.co/tuantran1632001/Psyfighter2-Orca2-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-ties", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T12:03:20.679754](https://huggingface.co/datasets/open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-ties/blob/main/results_2024-01-11T12-03-20.679754.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.603383654987928, "acc_stderr": 0.03303267584618269, "acc_norm": 0.607142680047232, "acc_norm_stderr": 0.033700954867739115, "mc1": 0.39167686658506734, "mc1_stderr": 0.01708779588176963, "mc2": 0.5540489547722205, "mc2_stderr": 0.01582448369078134 }, "harness|arc:challenge|25": { "acc": 0.5878839590443686, "acc_stderr": 0.014383915302225403, "acc_norm": 0.6245733788395904, "acc_norm_stderr": 0.01415063143511173 }, "harness|hellaswag|10": { "acc": 0.6296554471220872, "acc_stderr": 0.00481910045686781, "acc_norm": 0.8173670583549094, "acc_norm_stderr": 0.0038557568514415437 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368879, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.03782728980865469, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.03782728980865469 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6226415094339622, "acc_stderr": 0.029832808114796, "acc_norm": 0.6226415094339622, "acc_norm_stderr": 0.029832808114796 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6597222222222222, "acc_stderr": 0.039621355734862175, "acc_norm": 0.6597222222222222, "acc_norm_stderr": 0.039621355734862175 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332,
"acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5491329479768786, "acc_stderr": 0.0379401267469703, "acc_norm": 0.5491329479768786, "acc_norm_stderr": 0.0379401267469703 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207763, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207763 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3157894736842105, "acc_stderr": 0.04372748290278008, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.04372748290278008 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555497, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36772486772486773, "acc_stderr": 0.024833839825562417, "acc_norm": 0.36772486772486773, "acc_norm_stderr": 0.024833839825562417 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04285714285714281, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04285714285714281 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7419354838709677, "acc_stderr": 0.024892469172462826, "acc_norm": 0.7419354838709677, "acc_norm_stderr": 0.024892469172462826 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.03374402644139404, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.03374402644139404 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.030313710538198906, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.030313710538198906 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723875, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723875 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6128205128205129, "acc_stderr": 0.024697216930878948, "acc_norm": 0.6128205128205129, "acc_norm_stderr": 0.024697216930878948 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8055045871559633, "acc_stderr": 0.016970289090458026, "acc_norm": 0.8055045871559633, "acc_norm_stderr": 0.016970289090458026 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.46296296296296297, "acc_stderr": 0.03400603625538271, "acc_norm": 0.46296296296296297, "acc_norm_stderr": 0.03400603625538271 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7892156862745098, "acc_stderr": 0.02862654791243741, "acc_norm": 0.7892156862745098, "acc_norm_stderr": 0.02862654791243741 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290923, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290923 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699796, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699796 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7099236641221374, "acc_stderr": 0.03980066246467765, "acc_norm": 0.7099236641221374, "acc_norm_stderr": 0.03980066246467765 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664742, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664742 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028547, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028547 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260597, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260597 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.02158649400128136, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.02158649400128136 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.789272030651341, "acc_stderr": 0.014583812465862557, "acc_norm": 0.789272030651341, "acc_norm_stderr": 0.014583812465862557 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.653179190751445, "acc_stderr": 0.025624723994030454, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.025624723994030454 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3452513966480447, "acc_stderr": 0.015901432608930354, "acc_norm": 0.3452513966480447, "acc_norm_stderr": 0.015901432608930354 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6470588235294118, "acc_stderr": 0.027363593284684965, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.027363593284684965 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464492, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464492 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6851851851851852, "acc_stderr": 0.02584224870090217, "acc_norm": 
0.6851851851851852, "acc_norm_stderr": 0.02584224870090217 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.475177304964539, "acc_stderr": 0.029790719243829727, "acc_norm": 0.475177304964539, "acc_norm_stderr": 0.029790719243829727 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4322033898305085, "acc_stderr": 0.012652297777114968, "acc_norm": 0.4322033898305085, "acc_norm_stderr": 0.012652297777114968 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6029411764705882, "acc_stderr": 0.029722152099280065, "acc_norm": 0.6029411764705882, "acc_norm_stderr": 0.029722152099280065 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.630718954248366, "acc_stderr": 0.019524316744866356, "acc_norm": 0.630718954248366, "acc_norm_stderr": 0.019524316744866356 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.046075820907199756, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.046075820907199756 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7020408163265306, "acc_stderr": 0.02927956741106567, "acc_norm": 0.7020408163265306, "acc_norm_stderr": 0.02927956741106567 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7512437810945274, "acc_stderr": 0.030567675938916714, "acc_norm": 0.7512437810945274, "acc_norm_stderr": 0.030567675938916714 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.03861229196653694, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.03094445977853321, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.03094445977853321 }, "harness|truthfulqa:mc|0": { "mc1": 0.39167686658506734, "mc1_stderr": 0.01708779588176963, "mc2": 0.5540489547722205, "mc2_stderr": 0.01582448369078134 }, "harness|winogrande|5": { "acc": 0.7726913970007893, "acc_stderr": 0.011778612167091088 }, "harness|gsm8k|5": { "acc": 0.43669446550416985, "acc_stderr": 0.013661649780905488 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-ties
[ "region:us" ]
2024-01-11T12:05:41+00:00
{"pretty_name": "Evaluation run of tuantran1632001/Psyfighter2-Orca2-ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [tuantran1632001/Psyfighter2-Orca2-ties](https://huggingface.co/tuantran1632001/Psyfighter2-Orca2-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T12:03:20.679754](https://huggingface.co/datasets/open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-ties/blob/main/results_2024-01-11T12-03-20.679754.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.603383654987928,\n \"acc_stderr\": 0.03303267584618269,\n \"acc_norm\": 0.607142680047232,\n \"acc_norm_stderr\": 0.033700954867739115,\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.5540489547722205,\n \"mc2_stderr\": 0.01582448369078134\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.01415063143511173\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6296554471220872,\n \"acc_stderr\": 0.00481910045686781,\n \"acc_norm\": 0.8173670583549094,\n \"acc_norm_stderr\": 0.0038557568514415437\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.04372748290278008,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.04372748290278008\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462826,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n \"acc_norm\": 0.8497409326424871,\n 
\"acc_norm_stderr\": 0.025787723180723875\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878948,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878948\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.016970289090458026,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.016970289090458026\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290923,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290923\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n \"acc_stderr\": 0.014583812465862557,\n \"acc_norm\": 0.789272030651341,\n \"acc_norm_stderr\": 0.014583812465862557\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n \"acc_stderr\": 0.015901432608930354,\n \"acc_norm\": 0.3452513966480447,\n \"acc_norm_stderr\": 0.015901432608930354\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684965,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684965\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4322033898305085,\n \"acc_stderr\": 0.012652297777114968,\n \"acc_norm\": 0.4322033898305085,\n \"acc_norm_stderr\": 0.012652297777114968\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866356,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866356\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106567,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106567\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.5540489547722205,\n \"mc2_stderr\": 0.01582448369078134\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091088\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.43669446550416985,\n \"acc_stderr\": 0.013661649780905488\n }\n}\n```", "repo_url": "https://huggingface.co/tuantran1632001/Psyfighter2-Orca2-ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|arc:challenge|25_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|gsm8k|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hellaswag|10_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T12-03-20.679754.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T12-03-20.679754.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T12-03-20.679754.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T12-03-20.679754.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T12-03-20.679754.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["**/details_harness|winogrande|5_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-11T12-03-20.679754.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T12_03_20.679754", "path": ["results_2024-01-11T12-03-20.679754.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T12-03-20.679754.parquet"]}]}]}
2024-01-11T12:06:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of tuantran1632001/Psyfighter2-Orca2-ties Dataset automatically created during the evaluation run of model tuantran1632001/Psyfighter2-Orca2-ties on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T12:03:20.679754 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
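For instance (a sketch: the repo id below follows the leaderboard's usual `details_<org>__<model>` naming and is inferred rather than quoted from this card, while `harness_winogrande_5` is one of the config names listed in this repo's metadata):

```python
from datasets import load_dataset

# the "latest" split of every config points at the most recent evaluation run
data = load_dataset(
    "open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-ties",
    "harness_winogrande_5",
    split="latest",
)
```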
[ "# Dataset Card for Evaluation run of tuantran1632001/Psyfighter2-Orca2-ties\n\n\n\nDataset automatically created during the evaluation run of model tuantran1632001/Psyfighter2-Orca2-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T12:03:20.679754(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of tuantran1632001/Psyfighter2-Orca2-ties\n\n\n\nDataset automatically created during the evaluation run of model tuantran1632001/Psyfighter2-Orca2-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T12:03:20.679754(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
ef5eab53316dcc420677517058bd090a80259cd8
# Dataset of wind_chimes/ウィンドチャイム/铎铃 (Arknights) This is the dataset of wind_chimes/ウィンドチャイム/铎铃 (Arknights), containing 12 images and their tags. The core tags of this character are `breasts, horns, animal_ears, long_hair, bangs, black_hair, cow_ears, cow_horns, brown_hair, large_breasts, multicolored_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 12 | 18.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wind_chimes_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 12 | 9.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wind_chimes_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 31 | 22.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wind_chimes_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 12 | 15.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wind_chimes_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 31 | 31.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wind_chimes_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/wind_chimes_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, navel, crop_top, jacket, midriff, simple_background, smile, bare_shoulders, belt, white_background, white_shirt, black_shorts, fingerless_gloves, open_mouth, sleeveless_shirt, standing | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | navel | crop_top | jacket | midriff | simple_background | smile | bare_shoulders | belt | white_background | white_shirt | black_shorts | fingerless_gloves | open_mouth | sleeveless_shirt | standing | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-----------|:---------|:----------|:--------------------|:--------|:-----------------|:-------|:-------------------|:--------------|:---------------|:--------------------|:-------------|:-------------------|:-----------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
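The IMG+TXT packages from the table above can be fetched the same way as the raw archive; a minimal variant of the snippet above (the target directory name `dataset_800` is just an example, and any filename from the package list can be swapped in):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the pre-processed IMG+TXT packages
zip_file = hf_hub_download(
    repo_id='CyberHarem/wind_chimes_arknights',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the image/caption pairs to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```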
CyberHarem/wind_chimes_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T12:05:42+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T12:08:30+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of wind\_chimes/ウィンドチャイム/铎铃 (Arknights) =============================================== This is the dataset of wind\_chimes/ウィンドチャイム/铎铃 (Arknights), containing 12 images and their tags. The core tags of this character are 'breasts, horns, animal\_ears, long\_hair, bangs, black\_hair, cow\_ears, cow\_horns, brown\_hair, large\_breasts, multicolored\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
f02dc3f9b79bbcf2073f8511642bf86338f5d570
# Dataset of santalla/寒檀 (Arknights) This is the dataset of santalla/寒檀 (Arknights), containing 12 images and their tags. The core tags of this character are `animal_ears, breasts, hair_over_one_eye, long_hair, large_breasts, yellow_eyes, hat, white_hair, animal_ear_fluff, tail, bangs, white_headwear`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 12 | 24.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/santalla_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 12 | 11.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/santalla_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 25 | 20.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/santalla_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 12 | 19.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/santalla_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 25 | 33.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/santalla_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/santalla_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, fur_trim, gloves, coat, bare_shoulders, belt, blush, simple_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | fur_trim | gloves | coat | bare_shoulders | belt | blush | simple_background | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------|:---------|:-------|:-----------------|:-------|:--------|:--------------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X |
CyberHarem/santalla_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T12:07:45+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T12:11:37+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of santalla/寒檀 (Arknights) ================================== This is the dataset of santalla/寒檀 (Arknights), containing 12 images and their tags. The core tags of this character are 'animal\_ears, breasts, hair\_over\_one\_eye, long\_hair, large\_breasts, yellow\_eyes, hat, white\_hair, animal\_ear\_fluff, tail, bangs, white\_headwear', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
597278ae359471b85da934a914835ddbdd311696
# Dataset of frost/Frost/霜华 (Arknights) This is the dataset of frost/Frost/霜华 (Arknights), containing 69 images and their tags. The core tags of this character are `black_hair, hat, short_hair, breasts, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 69 | 68.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/frost_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 69 | 40.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/frost_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 151 | 75.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/frost_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 69 | 61.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/frost_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 151 | 104.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/frost_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/frost_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, beanie, mouth_mask, solo, looking_at_viewer, simple_background, black_headwear, black_eyes, white_background, artist_name, ass, barefoot, feet, toes | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_headwear, blue_eyes, looking_at_viewer, solo, beanie, black_gloves, fur_trim, jacket, long_sleeves, pants, holding_gun, knee_pads, tactical_clothes, bangs, closed_mouth, holster, mole_under_eye, rifle, standing, coat, knife, outdoors, pouch, snow | | 2 | 12 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1boy, 1girl, hetero, solo_focus, penis, beanie, nipples, uncensored, mouth_mask, nude, erection, fellatio | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | beanie | mouth_mask | solo | looking_at_viewer | simple_background | black_headwear | black_eyes | white_background | artist_name | ass | barefoot | feet | toes | blue_eyes | black_gloves | fur_trim | jacket | long_sleeves | pants | holding_gun | knee_pads | tactical_clothes | bangs | closed_mouth | holster | mole_under_eye | rifle | standing | coat | knife | outdoors | pouch | snow | 1boy | hetero | solo_focus | penis | nipples | uncensored | nude | erection | fellatio | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------------|:-------|:--------------------|:--------------------|:-----------------|:-------------|:-------------------|:--------------|:------|:-----------|:-------|:-------|:------------|:---------------|:-----------|:---------|:---------------|:--------|:--------------|:------------|:-------------------|:--------|:---------------|:----------|:-----------------|:--------|:-----------|:-------|:--------|:-----------|:--------|:-------|:-------|:---------|:-------------|:--------|:----------|:-------------|:-------|:-----------|:-----------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | 2 | 12 | 
![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
CyberHarem/frost_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T12:07:53+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T12:36:55+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of frost/Frost/霜华 (Arknights) ===================================== This is the dataset of frost/Frost/霜华 (Arknights), containing 69 images and their tags. The core tags of this character are 'black\_hair, hat, short\_hair, breasts, large\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
70c4dd1d7a61084c1878449d16c4ad1aa247aee2
# TravelPlanner Dataset TravelPlanner is a benchmark crafted for evaluating language agents in tool-use and complex planning within multiple constraints. (See our [paper](https://arxiv.org/pdf/2402.01622.pdf) for more details.) ## Introduction In TravelPlanner, for a given query, language agents are expected to formulate a comprehensive plan that includes transportation, daily meals, attractions, and accommodation for each day. TravelPlanner comprises 1,225 queries in total. The number of days and hard constraints are designed to test agents' abilities across both the breadth and depth of complex planning. ## Split <b>Train Set</b>: 5 queries with corresponding human-annotated plans per group, resulting in a total of 45 query-plan pairs. This set provides the human-annotated plans as demonstrations for in-context learning. <b>Validation Set</b>: 20 queries from each group, amounting to 180 queries in total. There are no human-annotated plans in this set. <b>Test Set</b>: 1,000 randomly distributed queries. To avoid data contamination, we only provide the level, days, and natural language query fields. ## Record Layout - "org": The city from where the journey begins. - "dest": The destination city. - "days": The number of days planned for the trip. - "visiting_city_number": The total number of cities included in the itinerary. - "date": The specific date when the travel is scheduled. - "people_number": The total number of people involved in the travel. - "local_constraint": The local hard constraint, including house rules, cuisine, room type, and transportation. - "query": A natural language description or request related to the travel plan. - "level": The difficulty level, which is determined by the number of hard constraints. - "annotated_plan": A detailed travel plan annotated by a human, ensuring compliance with all common sense requirements and specific hard constraints. - "reference_information": Reference information for "sole-planning" mode. ## Citation If our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries. ```bib @article{Xie2024TravelPlanner, author = {Jian Xie and Kai Zhang and Jiangjie Chen and Tinghui Zhu and Renze Lou and Yuandong Tian and Yanghua Xiao and Yu Su}, title = {TravelPlanner: A Benchmark for Real-World Planning with Language Agents}, journal = {arXiv preprint arXiv:2402.01622}, year = {2024} } ```
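To inspect the data directly with the `datasets` library, each split ships as its own configuration; a minimal sketch (config and split names are taken from this repo's metadata, where each configuration maps one CSV file to a split of the same name, and `query` is one of the fields documented in the record layout above):

```python
from datasets import load_dataset

# each configuration exposes a single split named after itself
train = load_dataset("osunlp/TravelPlanner", "train", split="train")
validation = load_dataset("osunlp/TravelPlanner", "validation", split="validation")
test = load_dataset("osunlp/TravelPlanner", "test", split="test")

print(train[0]["query"])  # the natural-language travel request for the first query
```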
osunlp/TravelPlanner
[ "license:cc-by-4.0", "arxiv:2402.01622", "region:us" ]
2024-01-11T12:10:14+00:00
{"license": "cc-by-4.0", "configs": [{"config_name": "train", "data_files": [{"split": "train", "path": "train.csv"}]}, {"config_name": "validation", "data_files": [{"split": "validation", "path": "validation.csv"}]}, {"config_name": "test", "data_files": [{"split": "test", "path": "test.csv"}]}]}
2024-02-05T02:28:17+00:00
[ "2402.01622" ]
[]
TAGS #license-cc-by-4.0 #arxiv-2402.01622 #region-us
# TravelPlanner Dataset TravelPlanner is a benchmark crafted for evaluating language agents in tool-use and complex planning within multiple constraints. (See our paper for more details.) ## Introduction In TravelPlanner, for a given query, language agents are expected to formulate a comprehensive plan that includes transportation, daily meals, attractions, and accommodation for each day. TravelPlanner comprises 1,225 queries in total. The number of days and hard constraints are designed to test agents' abilities across both the breadth and depth of complex planning. ## Split <b>Train Set</b>: 5 queries with corresponding human-annotated plans per group, resulting in a total of 45 query-plan pairs. This set provides the human-annotated plans as demonstrations for in-context learning. <b>Validation Set</b>: 20 queries from each group, amounting to 180 queries in total. There are no human-annotated plans in this set. <b>Test Set</b>: 1,000 randomly distributed queries. To avoid data contamination, we only provide the level, days, and natural language query fields. ## Record Layout - "org": The city from where the journey begins. - "dest": The destination city. - "days": The number of days planned for the trip. - "visiting_city_number": The total number of cities included in the itinerary. - "date": The specific date when the travel is scheduled. - "people_number": The total number of people involved in the travel. - "local_constraint": The local hard constraint, including house rules, cuisine, room type, and transportation. - "query": A natural language description or request related to the travel plan. - "level": The difficulty level, which is determined by the number of hard constraints. - "annotated_plan": A detailed travel plan annotated by a human, ensuring compliance with all common sense requirements and specific hard constraints. - "reference_information": Reference information for "sole-planning" mode. If our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries.
[ "# TravelPlanner Dataset\n\nTravelPlanner is a benchmark crafted for evaluating language agents in tool-use and complex planning within multiple constraints. (See our paper for more details.)", "## Introduction\n\nIn TravelPlanner, for a given query, language agents are expected to formulate a comprehensive plan that includes transportation, daily meals, attractions, and accommodation for each day.\n\nTravelPlanner comprises 1,225 queries in total. The number of days and hard constraints are designed to test agents' abilities across both the breadth and depth of complex planning.", "## Split\n\n<b>Train Set</b>: 5 queries with corresponding human-annotated plans for group, resulting in a total of 45 query-plan pairs. This set provides the human annotated plans as demonstrations for in-context learning.\n\n<b>Validation Set</b>: 20 queries from each group, amounting to 180 queries in total. There is no human annotated plan in this set.\n\n<b>Test Set</b>: 1,000 randomly distributed queries. To avoid data contamination, we only provide the level, days, and natural language query fields.", "## Record Layout\n\n- \"org\": The city from where the journey begins.\n- \"dest\": The destination city.\n- \"days\": The number of days planned for the trip.\n- \"visiting_city_number\": The total number of cities included in the itinerary.\n- \"date\": The specific date when the travel is scheduled.\n- \"people_numbe\": The total number of people involved in the travel.\n- \"local_constraint\": The local hard constraint, including house rule, cuisine, room type and transportation.\n- \"query\": A natural language description or request related to the travel plan.\n- \"level\": The difficulty level, which is determined by the number of hard constraints.\n- \"annotated_plan\": A detailed travel plan annotated by a human, ensuring compliance with all common sense requirements and specific hard constraints.\n- \"reference_information\": Reference information for \"sole-planning\" mode.\n\nIf our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries." ]
[ "TAGS\n#license-cc-by-4.0 #arxiv-2402.01622 #region-us \n", "# TravelPlanner Dataset\n\nTravelPlanner is a benchmark crafted for evaluating language agents in tool-use and complex planning within multiple constraints. (See our paper for more details.)", "## Introduction\n\nIn TravelPlanner, for a given query, language agents are expected to formulate a comprehensive plan that includes transportation, daily meals, attractions, and accommodation for each day.\n\nTravelPlanner comprises 1,225 queries in total. The number of days and hard constraints are designed to test agents' abilities across both the breadth and depth of complex planning.", "## Split\n\n<b>Train Set</b>: 5 queries with corresponding human-annotated plans for group, resulting in a total of 45 query-plan pairs. This set provides the human annotated plans as demonstrations for in-context learning.\n\n<b>Validation Set</b>: 20 queries from each group, amounting to 180 queries in total. There is no human annotated plan in this set.\n\n<b>Test Set</b>: 1,000 randomly distributed queries. To avoid data contamination, we only provide the level, days, and natural language query fields.", "## Record Layout\n\n- \"org\": The city from where the journey begins.\n- \"dest\": The destination city.\n- \"days\": The number of days planned for the trip.\n- \"visiting_city_number\": The total number of cities included in the itinerary.\n- \"date\": The specific date when the travel is scheduled.\n- \"people_numbe\": The total number of people involved in the travel.\n- \"local_constraint\": The local hard constraint, including house rule, cuisine, room type and transportation.\n- \"query\": A natural language description or request related to the travel plan.\n- \"level\": The difficulty level, which is determined by the number of hard constraints.\n- \"annotated_plan\": A detailed travel plan annotated by a human, ensuring compliance with all common sense requirements and specific hard constraints.\n- \"reference_information\": Reference information for \"sole-planning\" mode.\n\nIf our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries." ]
9851b7be16f8800f20f8f80fa914c1d1c1e47f8e
# Medical-Calgary-Cambridge-single-turn-llama2-300 296 single-turn conversational entries between a patient and doctor, using the Calgary-Cambridge model
kazcfz/Medical-Calgary-Cambridge-single-turn-llama2-300
[ "region:us" ]
2024-01-11T12:25:00+00:00
{}
2024-01-11T13:37:12+00:00
[]
[]
TAGS #region-us
# Medical-Calgary-Cambridge-single-turn-llama2-300 296 single-turn conversational entries between a patient and doctor, using the Calgary-Cambridge model
[ "# Medical-Calgary-Cambridge-single-turn-llama2-300 \n296 single-turn conversational entries between a patient and doctor, using the Calgary-Cambridge model" ]
[ "TAGS\n#region-us \n", "# Medical-Calgary-Cambridge-single-turn-llama2-300 \n296 single-turn conversational entries between a patient and doctor, using the Calgary-Cambridge model" ]
ff647b4960588c9b3308d1bc853c80f637e6ecc2
# Dataset Card for TowerEval-Data TowerEval-Data is the suite of datasets used to evaluate [Tower](https://huggingface.co/collections/Unbabel/tower-7b-v01-659eaedfe36e6dd29eb1805c), language models specialized for translation tasks such as machine translation (e.g. general, document, terminology-aware or context-aware translation), automatic post edition, named-entity recognition, grammatical error correction, and paraphrase generation. For generation and evaluation code, see our repo [`tower-eval`](https://github.com/deep-spin/tower-eval). - **Curated by:** Unbabel, Instituto Superior Técnico, CentraleSupélec, University of Paris-Saclay; - **Language(s) (NLP):** English, Portuguese, Spanish, French, German, Dutch, Italian, Korean, Chinese, Russian; - **License:** TowerEval contains data from many sources. We refer to the respective data sources below for information regarding licensing of the data. ## Dataset Details TowerEval contains 0- and few-shot instructions created from, and the corresponding raw data of, the following sources: | Data Source | Task(s) | | -------------- | ----------- | | [Flores](https://github.com/facebookresearch/flores) | General Translation | | [WMT23](https://www2.statmt.org/wmt23/translation-task.html#_data) | General Translation | | [TICO-19](https://tico-19.github.io/) | Domain-specific Translation | | [WMT23](https://www2.statmt.org/wmt23/translation-task.html#_data) | Automatic Post Edition (NLLB 3B translations on WMT23 test data) | | [MultiCoNER II](https://multiconer.github.io/) | Named Entity Recognition (1000 randomly selected test instances) | | [CoNLL-2014](https://www.comp.nus.edu.sg/~nlp/conll14st.html) | Grammatical Error Correction | | [COWS-L2H](https://github.com/abhisaary/spanish_gec) | Grammatical Error Correction | | [mlconvgec2018](https://github.com/adrianeboyd/boyd-wnut2018/) | Grammatical Error Correction | ## Intended uses and limitations TowerEval-Data is intended to be used to evaluate large language models on translation and related tasks. Check out our [repo](https://github.com/deep-spin/tower-eval) for details on how to use the data. ## Citation To be completed.
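Since usage details live in the `tower-eval` repo, a minimal, assumption-light way to get at the task files is to snapshot the whole dataset repository locally (a sketch; the toolkit linked above consumes these files from disk):

```python
from huggingface_hub import snapshot_download

# repo_type="dataset" is required when downloading from a dataset repo
local_dir = snapshot_download(
    repo_id="Unbabel/TowerEval-Data-v0.1",
    repo_type="dataset",
)
print(local_dir)  # local path containing the raw 0-/few-shot instruction files
```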
Unbabel/TowerEval-Data-v0.1
[ "task_categories:translation", "task_categories:text-generation", "size_categories:10K<n<100K", "language:en", "language:de", "language:fr", "language:zh", "language:pt", "language:nl", "language:ru", "language:ko", "language:it", "language:es", "region:us" ]
2024-01-11T12:44:56+00:00
{"language": ["en", "de", "fr", "zh", "pt", "nl", "ru", "ko", "it", "es"], "size_categories": ["10K<n<100K"], "task_categories": ["translation", "text-generation"]}
2024-02-13T01:41:41+00:00
[]
[ "en", "de", "fr", "zh", "pt", "nl", "ru", "ko", "it", "es" ]
TAGS #task_categories-translation #task_categories-text-generation #size_categories-10K<n<100K #language-English #language-German #language-French #language-Chinese #language-Portuguese #language-Dutch #language-Russian #language-Korean #language-Italian #language-Spanish #region-us
Dataset Card for TowerEval-Data =============================== TowerEval-Data is the suite of datasets used to evaluate Tower, language models specialized for translation tasks such as machine translation (e.g. general, document, terminology-aware or context-aware translation), automatic post edition, named-entity recognition, grammatical error correction, and paraphrase generation. For generation and evaluation code, see our repo 'tower-eval'. * Curated by: Unbabel, Instituto Superior Técnico, CentraleSupélec, University of Paris-Saclay; * Language(s) (NLP): English, Portuguese, Spanish, French, German, Dutch, Italian, Korean, Chinese, Russian; * License: TowerEval contains data from many sources. We refer to the respective data sources below for information regarding licensing of the data. Dataset Details --------------- TowerEval contains 0- and few-shot instructions created from, and the corresponding raw data of, the following sources: Intended uses and limitations ----------------------------- TowerEval-Data is intended to be used to evaluate large language models on translation and related tasks. Check out our repo for details on how to use the data. To be completed.
[]
[ "TAGS\n#task_categories-translation #task_categories-text-generation #size_categories-10K<n<100K #language-English #language-German #language-French #language-Chinese #language-Portuguese #language-Dutch #language-Russian #language-Korean #language-Italian #language-Spanish #region-us \n" ]
d4617cc7be46c5e2624ce84a674922631e7c03b3
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** Zui Chen & Yezeng Chen - **Language(s) (NLP):** Chinese & English & Code - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** GSM8K & Math & TAL-SCQ ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. 
## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
cyzhh/TAL-SCQ-CN_mix
[ "task_categories:question-answering", "size_categories:10K<n<100K", "Math", "region:us" ]
2024-01-11T12:45:06+00:00
{"size_categories": ["10K<n<100K"], "task_categories": ["question-answering"], "tags": ["Math"]}
2024-01-11T12:54:24+00:00
[]
[]
TAGS #task_categories-question-answering #size_categories-10K<n<100K #Math #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: Zui Chen & Yezeng Chen - Language(s) (NLP): Chinese & English & Code - License: ### Dataset Sources [optional] - Repository: GSM8K & Math & TAL-SCQ ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: Zui Chen & Yezeng Chen \n- Language(s) (NLP): Chinese & English & Code \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: GSM8K & Math & TAL-SCQ", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #Math #region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: Zui Chen & Yezeng Chen \n- Language(s) (NLP): Chinese & English & Code \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: GSM8K & Math & TAL-SCQ", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a7689fb14b335cdc9ffb60095639c31f4ed3738a
# Dataset of ichika/仲正イチカ/一花 (Blue Archive)

This is the dataset of ichika/仲正イチカ/一花 (Blue Archive), containing 500 images and their tags.

The core tags of this character are `long_hair, black_hair, bangs, halo, hair_ornament, hairclip, wings, black_wings, breasts, low_wings, feathered_wings, blue_eyes, large_breasts`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                              | Type       | Description                                                          |
|:-----------------|-------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw              | 500    | 939.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ichika_bluearchive/resolve/main/dataset-raw.zip)                  | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500    | 408.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ichika_bluearchive/resolve/main/dataset-800.zip)                  | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800  | 1300   | 924.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ichika_bluearchive/resolve/main/dataset-stage3-p480-800.zip)      | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |
| 1200             | 500    | 765.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ichika_bluearchive/resolve/main/dataset-1200.zip)                 | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 | 1300   | 1.51 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/ichika_bluearchive/resolve/main/dataset-stage3-p480-1200.zip)     | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

### Load Raw Dataset with Waifuc

We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/ichika_bluearchive',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_sailor_collar, black_serafuku, black_shirt, black_skirt, long_sleeves, midriff, pleated_skirt, red_neckerchief, simple_background, solo, white_background, armband, black_choker, black_gloves, blush, cowboy_shot, smile, closed_eyes, navel, closed_mouth, crop_top_overhang, parted_lips, arms_up, miniskirt, very_long_hair | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, armband, black_choker, black_gloves, black_sailor_collar, black_serafuku, black_skirt, long_sleeves, looking_at_viewer, pleated_skirt, red_neckerchief, shirt, simple_background, smile, solo, white_background, blush, closed_mouth | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, completely_nude, hetero, navel, nipples, solo_focus, 1boy, penis, black_choker, cowgirl_position, girl_on_top, sex, vaginal, pussy, stomach, sweat, collarbone, looking_at_viewer, medium_breasts, pov, simple_background, closed_eyes, grin, mosaic_censoring, spread_legs | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, black_bikini, blush, navel, simple_background, solo, black_choker, collarbone, smile, stomach, white_background, alternate_costume, cleavage, closed_eyes, closed_mouth, cowboy_shot, micro_bikini, string_bikini, sweat, very_long_hair, armpits, black_gloves, groin, skindentation, standing, thighs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_sailor_collar | black_serafuku | black_shirt | black_skirt | long_sleeves | midriff | pleated_skirt | red_neckerchief | simple_background | solo | white_background | armband | black_choker | black_gloves | blush | cowboy_shot | smile | closed_eyes | navel | closed_mouth | crop_top_overhang | parted_lips | arms_up | miniskirt | very_long_hair | looking_at_viewer | shirt | completely_nude | hetero | nipples | solo_focus | 1boy | penis | cowgirl_position | girl_on_top | sex | vaginal | pussy | stomach | sweat | collarbone | medium_breasts | pov | grin | mosaic_censoring | spread_legs | black_bikini | alternate_costume | cleavage | micro_bikini | string_bikini | armpits | groin | skindentation | standing | thighs | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------------|:-----------------|:--------------|:--------------|:---------------|:----------|:----------------|:------------------|:--------------------|:-------|:-------------------|:----------|:---------------|:---------------|:--------|:--------------|:--------|:--------------|:--------|:---------------|:--------------------|:--------------|:----------|:------------|:-----------------|:--------------------|:--------|:------------------|:---------|:----------|:-------------|:-------|:--------|:-------------------|:--------------|:------|:----------|:--------|:----------|:--------|:-------------|:-----------------|:------|:-------|:-------------------|:--------------|:---------------|:--------------------|:-----------|:---------------|:----------------|:----------|:--------|:----------------|:-----------|:---------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | | X | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | | | | | | | X | | | | X | | X | | | X | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | | | | | | | X | X | X | | X | X | X | X | X | X | X | X | | | | | X | | | | | | | | | | | | | | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | X |
CyberHarem/ichika_bluearchive
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-11T12:55:28+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T15:23:43+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of ichika/仲正イチカ/一花 (Blue Archive)
=========================================

This is the dataset of ichika/仲正イチカ/一花 (Blue Archive), containing 500 images and their tags.

The core tags of this character are 'long\_hair, black\_hair, bangs, halo, hair\_ornament, hairclip, wings, black\_wings, breasts, low\_wings, feathered\_wings, blue\_eyes, large\_breasts', which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).

List of Packages
----------------

### Load Raw Dataset with Waifuc

We provide a raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code.

List of Clusters
----------------

List of tag clustering results; maybe some outfits can be mined here.

### Raw Text Version

### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
7f48159fd39e62b7b4ddb3dfc0d189d6a5619862
# Gutenberg DPO

![gutenberg](gutenberg.png)

## Overview

This is a dataset meant to enhance the novel writing capabilities of LLMs, by using public domain books from [Project Gutenberg](https://gutenberg.org/)

## Process

First, each book is parsed, split into chapters, and cleaned up from the original format (remove superfluous newlines, illustration tags, etc.).

Once we have chapters, an LLM is prompted with each chapter to create a synthetic prompt that would result in that chapter being written.
Each chapter has a summary created as well, so that the prompts for each chapter after the first also include a summary of the previous chapter to provide additional context.

We then use the synthetic prompt with the previous chapter summary to write the chapter with an LLM (llama-2-13b-chat, bagel-7b-v0.1, dolphin-2.2-34b).
The human written text, that is, the original chapter, is used as the "chosen" value, and the LLM written chapter is used as the "rejected" value.

## Books used

These books were chosen mainly because they appeared in the popular section on Project Gutenberg, and they function correctly with the chapterize library.

- Huckleberry Finn
- Treasure Island
- Anna Karenina
- Uncle Tom’s Cabin
- Wuthering Heights
- Madame Bovary
- The Turn of the Screw
- The War of the Worlds
- A Study in Scarlet
- Middlemarch
- Pride and Prejudice
- The Brothers Karamazov
- Through the Looking Glass
- Moby Dick
- Frankenstein
- A Tale of Two Cities
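## Example record construction

As an illustrative sketch of how a single chosen/rejected pair could be assembled from the pieces described above — the function and variable names here are hypothetical placeholders, not the actual code used to build the dataset:

```python
def build_dpo_record(synthetic_prompt: str, previous_summary: str,
                     human_chapter: str, llm_chapter: str) -> dict:
    """Pair the original (human-written) chapter with the LLM rewrite for DPO."""
    prompt = synthetic_prompt
    if previous_summary:
        # Prepend the previous chapter's summary for additional context,
        # mirroring the prompting setup described above.
        prompt = f"Summary of previous chapter: {previous_summary}\n\n{synthetic_prompt}"
    return {
        "prompt": prompt,
        "chosen": human_chapter,   # the human-written original is preferred
        "rejected": llm_chapter,   # the model-written chapter is dispreferred
    }
```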
jondurbin/gutenberg-dpo-v0.1
[ "size_categories:n<1K", "language:en", "license:cc-by-4.0", "dpo", "region:us" ]
2024-01-11T13:15:41+00:00
{"language": ["en"], "license": "cc-by-4.0", "size_categories": ["n<1K"], "pretty_name": "Gutenberg DPO", "tags": ["dpo"]}
2024-01-12T13:05:37+00:00
[]
[ "en" ]
TAGS #size_categories-n<1K #language-English #license-cc-by-4.0 #dpo #region-us
# Gutenberg DPO

!gutenberg

## Overview

This is a dataset meant to enhance the novel writing capabilities of LLMs, by using public domain books from Project Gutenberg

## Process

First, each book is parsed, split into chapters, and cleaned up from the original format (remove superfluous newlines, illustration tags, etc.).

Once we have chapters, an LLM is prompted with each chapter to create a synthetic prompt that would result in that chapter being written.
Each chapter has a summary created as well, so that the prompts for each chapter after the first also include a summary of the previous chapter to provide additional context.

We then use the synthetic prompt with the previous chapter summary to write the chapter with an LLM (llama-2-13b-chat, bagel-7b-v0.1, dolphin-2.2-34b).
The human written text, that is, the original chapter, is used as the "chosen" value, and the LLM written chapter is used as the "rejected" value.

## Books used

These books were chosen mainly because they appeared in the popular section on Project Gutenberg, and they function correctly with the chapterize library.

- Huckleberry Finn
- Treasure Island
- Anna Karenina
- Uncle Tom’s Cabin
- Wuthering Heights
- Madame Bovary
- The Turn of the Screw
- The War of the Worlds
- A Study in Scarlet
- Middlemarch
- Pride and Prejudice
- The Brothers Karamazov
- Through the Looking Glass
- Moby Dick
- Frankenstein
- A Tale of Two Cities
[ "# Gutenberg DPO\n\n!gutenberg", "## Overview\n\nThis is a dataset meant to enhance novel writing capabilities of LLMs, by using public domain books from Project Gutenberg", "## Process\n\nFirst, the each book is parsed, split into chapters, cleaned up from the original format (remove superfluous newlines, illustration tags, etc.).\n\nOnce we have chapters, an LLM is prompted with each chapter to create a synthetic prompt that would result in that chapter being written.\nEach chapter has a summary created as well, so that the prompts for each chapter after the also include a summary of the previous chapter to provide additional context.\n\nWe then use the synthetic prompt with previous chapter summary to write the chapter with an LLM (llama-2-13b-chat, bagel-7b-v0.1, dolphin-2.2-34b).\nThe human written text, that is, the original chapter, is used as the \"chosen\" value, and the LLM written chapter is used as the rejected value.", "## Books used\n\nThese books were chosen main because they appeared in the popular section on project gutenberg, and they function correctly with the chapterize library.\n\n- Huckleberry Finn\n- Treasure Island\n- Anna Karenina\n- Uncle Tom’s Cabin\n- Wuthering Heights\n- Madame Bovary\n- The Turn of the Screw\n- The War of the Worlds\n- A Study in Scarlet\n- Middlemarch\n- Pride and Prejudice\n- The Brothers Karamazov\n- Through the Looking Glass\n- Moby Dick\n- Frankenstein\n- A Tale of Two Cities" ]
[ "TAGS\n#size_categories-n<1K #language-English #license-cc-by-4.0 #dpo #region-us \n", "# Gutenberg DPO\n\n!gutenberg", "## Overview\n\nThis is a dataset meant to enhance novel writing capabilities of LLMs, by using public domain books from Project Gutenberg", "## Process\n\nFirst, the each book is parsed, split into chapters, cleaned up from the original format (remove superfluous newlines, illustration tags, etc.).\n\nOnce we have chapters, an LLM is prompted with each chapter to create a synthetic prompt that would result in that chapter being written.\nEach chapter has a summary created as well, so that the prompts for each chapter after the also include a summary of the previous chapter to provide additional context.\n\nWe then use the synthetic prompt with previous chapter summary to write the chapter with an LLM (llama-2-13b-chat, bagel-7b-v0.1, dolphin-2.2-34b).\nThe human written text, that is, the original chapter, is used as the \"chosen\" value, and the LLM written chapter is used as the rejected value.", "## Books used\n\nThese books were chosen main because they appeared in the popular section on project gutenberg, and they function correctly with the chapterize library.\n\n- Huckleberry Finn\n- Treasure Island\n- Anna Karenina\n- Uncle Tom’s Cabin\n- Wuthering Heights\n- Madame Bovary\n- The Turn of the Screw\n- The War of the Worlds\n- A Study in Scarlet\n- Middlemarch\n- Pride and Prejudice\n- The Brothers Karamazov\n- Through the Looking Glass\n- Moby Dick\n- Frankenstein\n- A Tale of Two Cities" ]
ebe725bb792931d268abeed3154850a69b577b08
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
Kkk111k/katya_lysenka_
[ "region:us" ]
2024-01-11T13:25:39+00:00
{}
2024-01-11T13:28:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b4484cda5b577efcc958c5c3d4b78fe1086e2799
---

<h1 align="center">🌸 Haiku DPO 🌸</h1>
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/60107b385ac3e86b3ea4fc34/veyblgmspfou3f3SgZxwX.png" alt="Your Image" width="500">
</p>
<p align="center"><em>In data, words flow,<br> Teaching AI the art of<br> Haiku, line by line. </em></p>

# Dataset Card for Haiku DPO
[<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-dark.png" alt="Built with Distilabel" width="200" height="32"/>](https://github.com/argilla-io/distilabel)
<!-- Provide a quick summary of the dataset. -->

This is a synthetic dataset of haikus. The dataset is constructed with the goal of helping to train LLMs to be more 'technically' competent at writing haikus.

## Dataset Details

The data consists of a few different components that are described in more detail below, but the key components are:
- a column of synthetically generated user prompts requesting a haiku
- a column consisting of multiple responses to this prompt, generated by a language model
- a column consisting of scores for each of these responses, generated by a rule-based system

The goal of this dataset was to help the author explore the process of synthesizing a dataset for DPO and to explore the extent to which DPO can be used to capture aesthetic preferences in language generation.

Haiku also has the nice property of being relatively easy to score on a 'technical basis', i.e. do they follow the 5-7-5 syllable structure? As a result of this property, some relatively simple Python functions can be used to rate the technical quality of a haiku.

By focusing on a narrower task, this dataset also intends to offer a place to explore questions such as:
- Should DPO datasets prioritize a large gap in scores between the 'best' and 'worst' generations?
- Is more data better or is a bigger gap in scores better?

I am also interested in exploring the extent to which smaller models can learn to perform well at a narrower task. Again, haiku writing here is a good candidate for this exploration as it is relatively narrow, the data is cheaper to generate and it is relatively easy to score on a technical basis, so you don't need to rely on human annotation or a "judge" LM to score the generations.

### Dataset Description

- **Curated by:** Daniel van Strien
- **Language(s) (NLP):** English (synthetically generated)
- **License:** Creative Commons Attribution 4.0 International License

## Uses

This dataset can be used "as is" to help train LLMs to be more 'technically' competent at writing haikus. However, it is also intended as a "test bed" for exploring how different qualities of a DPO dataset impact models trained on these datasets.

### Direct Use

The `default` config can be used for training DPO models. The "chosen" and "rejected" columns contain the highest-quality and lowest-quality generations respectively. You may, however, want to filter the dataset in other ways to explore how different qualities of a DPO dataset impact the resulting model.

### Out-of-Scope Use

This dataset was constructed with a rather narrow goal in mind. It is unlikely to be useful for other tasks. However, it may be useful as a test bed for exploring how different qualities of a DPO dataset impact the resulting model.

## Dataset Structure

The dataset consists of a few different configurations:

- `default`: this is likely to be the most useful one for most users.
It contains the highest-quality and lowest-quality generations in the "chosen" and "rejected" columns respectively. It also contains the "difference_in_score" column, which is the difference between the score of the highest-quality generation and the lowest-quality generation. This column can be used to filter the dataset to explore how different qualities of a DPO dataset impact the resulting model.

The `default` configuration has the following columns:
- 'question': the prompt requesting a haiku
- 'generation_model': the name of the model used to generate the haiku
- 'generation_prompt': the full prompt used to generate the haiku
- 'generations': the haikus generated by the model
- 'scores': the scores for each of the haikus
- 'chosen': the highest-quality haiku
- 'chosen_score': the score for the highest-quality haiku
- 'rejected': the lowest-quality haiku
- 'rejected_score': the score for the lowest-quality haiku
- 'tie': whether the highest-quality and lowest-quality haikus have the same score
- 'difference_in_score': the difference between the score of the highest-quality generation and the lowest-quality generation
- 'system': the system prompt used during generation

The `default` configuration removes ties and ensures the lowest-quality generation has a score below 3. More information on the scoring process is outlined below.

The `rule_ranked` configuration is similar to the `default` configuration, but it has not been filtered at all, so it will give you more scope for things like including ties in your dataset.

## Dataset Creation

This dataset was generated using the [distilabel](https://github.com/argilla-io/distilabel) library using [teknium](https://huggingface.co/teknium)'s [OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) model. The prompts were generated from a seed list of terms and an adapted version of the [SELF-INSTRUCT](https://arxiv.org/abs/2212.10560) paper's prompting strategy. You can see more details about the process of generating these prompts in the associated dataset [davanstrien/haiku_prompts](https://huggingface.co/datasets/davanstrien/haiku_prompts).

From these initial prompts, multiple generations of haiku were generated (again using teknium's OpenHermes-2.5-Mistral-7B model). These generations were then scored using a rule-based system. This rule system scored haikus out of 4, with the following approach to scoring:

If the haiku is not three lines it scores zero. Then for each line, 1 point is deducted if the line does not match the expected syllable count for that line. This means a haiku with three lines matching the traditional 5-7-5 syllable structure will score 4. A haiku with one line with an incorrect syllable count will score 3.

The rule-based system is not perfect and there are some cases where it will incorrectly score a haiku. However, it is relatively easy to understand, and it is relatively easy to score a haiku manually, so it is a good candidate for a rule-based system. The code for this is shared in this [GitHub repository](https://github.com/davanstrien/haiku-dpo).

### Curation Rationale

The dataset was curated with the following goals in mind:
- to explore the process of using open models to generate synthetic datasets
- to explore the use of rules for ranking generations
- to explore how different slices of a DPO dataset impact the resulting model

### Source Data

#### Data Collection and Processing

See above for the process of generating the data.
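As a rough illustration of the rule-based scoring described under Dataset Creation above — not the exact implementation from the linked repository; in particular, the vowel-group syllable counter below is a naive stand-in for whatever counter the real code uses:

```python
import re

def count_syllables(word: str) -> int:
    """Naive syllable estimate: count runs of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def score_haiku(haiku: str) -> int:
    """Score a haiku out of 4 under the rules described above."""
    lines = [line for line in haiku.strip().splitlines() if line.strip()]
    if len(lines) != 3:
        return 0  # anything that is not three lines scores zero
    score = 4
    for line, target in zip(lines, (5, 7, 5)):
        syllables = sum(count_syllables(w) for w in re.findall(r"[A-Za-z']+", line))
        if syllables != target:
            score -= 1  # deduct one point per line that misses its count
    return score
```

Since the published scores were produced by the repository's own code, treat this only as a reference for the rule itself, not as a way to reproduce those scores exactly.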
#### Who are the source data producers?

Almost all of the data is synthetic. The prompts were generated using a seed list of terms and an adapted version of the [SELF-INSTRUCT](https://arxiv.org/abs/2212.10560) paper's prompting strategy. The generations were generated using teknium's OpenHermes-2.5-Mistral-7B model. The scores were generated using a rule-based system. The initial prompt seed terms were generated by Daniel van Strien with some help from GPT-4.

### Annotations

There are no traditional annotations in this dataset. However, the scores are generated using a rule-based system.

#### Personal and Sensitive Information

It is very unlikely that this dataset contains any personal or sensitive information, but if you find any prompts that you believe to be harmful, please open a discussion and I will remove them from the dataset.

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

Whilst I have not found any harmful prompts in the dataset, I have not manually validated all of the prompts. If you find any prompts which you believe to be harmful, please open a discussion and I will remove them from the dataset.

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

The original seed prompts used to generate this dataset are by no means comprehensive, and the dataset is likely to be biased toward the topics covered by the seed prompts. This dataset will likely develop over time. If you have any suggestions for additional seed prompts, please open a discussion and I will add them to the dataset.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

I have zero expectation that this dataset will be cited, but if you do use it in your work, you can cite it as follows:

**BibTeX:**

```bibtex
@misc{vanstrien2021haiku,
  title={Haiku DPO},
  author={{van Strien}, Daniel},
  year={2024},
  eprint={2110.00482},
  publisher = {Hugging Face},
  howpublished = {\url{https://huggingface.co/datasets/davanstrien/haiku_dpo}}
}
```

## Glossary

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

- DPO/Direct Preference Optimization: Introduced in [*Direct Preference Optimization: Your Language Model is Secretly a Reward Model*](https://huggingface.co/papers/2305.18290)
- SELF-INSTRUCT: A prompting strategy introduced in [*Self-Instruct: Aligning Language Model with Self Generated Instructions*](https://huggingface.co/papers/2212.10560)

## Dataset Card Authors

[davanstrien](https://huggingface.co/davanstrien)

## Dataset Card Contact

[davanstrien](https://huggingface.co/davanstrien)
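For convenience, a minimal loading sketch tied to the Direct Use notes above. This assumes the Hugging Face `datasets` library; the score-gap threshold of 2 is an arbitrary illustrative choice, not a recommendation:

```python
from datasets import load_dataset

# Load the DPO-ready `default` configuration.
ds = load_dataset("davanstrien/haiku_dpo", split="train")

# Keep only pairs with a large chosen/rejected quality gap, e.g. to test
# whether a bigger gap between generations helps DPO training.
big_gap = ds.filter(lambda row: row["difference_in_score"] >= 2)
print(f"{len(big_gap)} of {len(ds)} pairs have a score gap of at least 2")
```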
davanstrien/haiku_dpo
[ "task_categories:text-generation", "task_categories:reinforcement-learning", "task_categories:conversational", "size_categories:1K<n<10K", "license:cc-by-4.0", "dpo", "poetry", "synthetic", "arxiv:2212.10560", "arxiv:2110.00482", "arxiv:2305.18290", "region:us" ]
2024-01-11T13:32:12+00:00
{"license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation", "reinforcement-learning", "conversational"], "pretty_name": "Haiku DPO", "dataset_info": [{"config_name": "default", "features": [{"name": "question", "dtype": "string"}, {"name": "generation_model", "sequence": "string"}, {"name": "generation_prompt", "sequence": "string"}, {"name": "generations", "sequence": "string"}, {"name": "scores", "sequence": "int64"}, {"name": "chosen", "dtype": "string"}, {"name": "chosen_score", "dtype": "int64"}, {"name": "rejected", "dtype": "string"}, {"name": "rejected_score", "dtype": "int64"}, {"name": "tie", "dtype": "bool"}, {"name": "difference_in_score", "dtype": "int64"}, {"name": "system", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 45631767, "num_examples": 4123}], "download_size": 3632867, "dataset_size": 45631767}, {"config_name": "raw", "features": [{"name": "prompt", "dtype": "string"}, {"name": "responses", "sequence": "string"}, {"name": "scores", "sequence": "int64"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "tie", "dtype": "bool"}, {"name": "difference_in_score", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 5462, "num_examples": 10}], "download_size": 9198, "dataset_size": 5462}, {"config_name": "raw-haikus", "features": [{"name": "input", "dtype": "string"}, {"name": "generation_model", "sequence": "string"}, {"name": "generation_prompt", "sequence": "string"}, {"name": "raw_generation_responses", "sequence": "string"}, {"name": "generations", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 52003027, "num_examples": 4303}], "download_size": 6328873, "dataset_size": 52003027}, {"config_name": "raw-scored-haikus", "features": [{"name": "input", "dtype": "string"}, {"name": "generation_model", "sequence": "string"}, {"name": "generation_prompt", "sequence": "string"}, {"name": "generations", "sequence": "string"}, {"name": "scores", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 26255574, "num_examples": 3220}], "download_size": 1986498, "dataset_size": 26255574}, {"config_name": "rule_ranked", "features": [{"name": "input", "dtype": "string"}, {"name": "generation_model", "sequence": "string"}, {"name": "generation_prompt", "sequence": "string"}, {"name": "generations", "sequence": "string"}, {"name": "scores", "sequence": "int64"}, {"name": "chosen", "dtype": "string"}, {"name": "chosen_score", "dtype": "int64"}, {"name": "rejected", "dtype": "string"}, {"name": "rejected_score", "dtype": "int64"}, {"name": "tie", "dtype": "bool"}, {"name": "difference_in_score", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 46515868, "num_examples": 4302}], "download_size": 3772778, "dataset_size": 46515868}], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}, {"config_name": "raw", "data_files": [{"split": "train", "path": "raw/train-*"}]}, {"config_name": "raw-haikus", "data_files": [{"split": "train", "path": "raw-haikus/train-*"}]}, {"config_name": "raw-scored-haikus", "data_files": [{"split": "train", "path": "raw-scored-haikus/train-*"}]}, {"config_name": "raw_prompts", "data_files": [{"split": "train", "path": "raw_prompts/train-*"}]}, {"config_name": "rule_ranked", "data_files": [{"split": "train", "path": "rule_ranked/train-*"}]}], "tags": ["dpo", "poetry", "synthetic"]}
2024-02-05T14:22:36+00:00
[ "2212.10560", "2110.00482", "2305.18290" ]
[]
TAGS #task_categories-text-generation #task_categories-reinforcement-learning #task_categories-conversational #size_categories-1K<n<10K #license-cc-by-4.0 #dpo #poetry #synthetic #arxiv-2212.10560 #arxiv-2110.00482 #arxiv-2305.18290 #region-us
---

<h1 align="center"> Haiku DPO </h1>
<p align="center">
<img src="URL alt="Your Image" width="500">
</p>
<p align="center"><em>In data, words flow,<br> Teaching AI the art of<br> Haiku, line by line. </em></p>

# Dataset Card for Haiku DPO
<img src="URL alt="Built with Distilabel" width="200" height="32"/>

This is a synthetic dataset of haikus. The dataset is constructed with the goal of helping to train LLMs to be more 'technically' competent at writing haikus.

## Dataset Details

The data consists of a few different components that are described in more detail below, but the key components are:
- a column of synthetically generated user prompts requesting a haiku
- a column consisting of multiple responses to this prompt, generated by a language model
- a column consisting of scores for each of these responses, generated by a rule-based system

The goal of this dataset was to help the author explore the process of synthesizing a dataset for DPO and to explore the extent to which DPO can be used to capture aesthetic preferences in language generation.

Haiku also has the nice property of being relatively easy to score on a 'technical basis', i.e. do they follow the 5-7-5 syllable structure? As a result of this property, some relatively simple Python functions can be used to rate the technical quality of a haiku.

By focusing on a narrower task, this dataset also intends to offer a place to explore questions such as:
- Should DPO datasets prioritize a large gap in scores between the 'best' and 'worst' generations?
- Is more data better or is a bigger gap in scores better?

I am also interested in exploring the extent to which smaller models can learn to perform well at a narrower task. Again, haiku writing here is a good candidate for this exploration as it is relatively narrow, the data is cheaper to generate and it is relatively easy to score on a technical basis, so you don't need to rely on human annotation or a "judge" LM to score the generations.

### Dataset Description

- Curated by: Daniel van Strien
- Language(s) (NLP): English (synthetically generated)
- License: Creative Commons Attribution 4.0 International License

## Uses

This dataset can be used "as is" to help train LLMs to be more 'technically' competent at writing haikus. However, it is also intended as a "test bed" for exploring how different qualities of a DPO dataset impact models trained on these datasets.

### Direct Use

The 'default' config can be used for training DPO models. The "chosen" and "rejected" columns contain the highest-quality and lowest-quality generations respectively. You may, however, want to filter the dataset in other ways to explore how different qualities of a DPO dataset impact the resulting model.

### Out-of-Scope Use

This dataset was constructed with a rather narrow goal in mind. It is unlikely to be useful for other tasks. However, it may be useful as a test bed for exploring how different qualities of a DPO dataset impact the resulting model.

## Dataset Structure

The dataset consists of a few different configurations:

- 'default': this is likely to be the most useful one for most users. It contains the highest-quality and lowest-quality generations in the "chosen" and "rejected" columns respectively. It also contains the "difference_in_score" column, which is the difference between the score of the highest-quality generation and the lowest-quality generation. This column can be used to filter the dataset to explore how different qualities of a DPO dataset impact the resulting model.
The 'default' configuration has the following columns:
- 'question': the prompt requesting a haiku
- 'generation_model': the name of the model used to generate the haiku
- 'generation_prompt': the full prompt used to generate the haiku
- 'generations': the haikus generated by the model
- 'scores': the scores for each of the haikus
- 'chosen': the highest-quality haiku
- 'chosen_score': the score for the highest-quality haiku
- 'rejected': the lowest-quality haiku
- 'rejected_score': the score for the lowest-quality haiku
- 'tie': whether the highest-quality and lowest-quality haikus have the same score
- 'difference_in_score': the difference between the score of the highest-quality generation and the lowest-quality generation
- 'system': the system prompt used during generation

The 'default' configuration removes ties and ensures the lowest-quality generation has a score below 3. More information on the scoring process is outlined below.

The 'rule_ranked' configuration is similar to the 'default' configuration, but it has not been filtered at all, so it will give you more scope for things like including ties in your dataset.

## Dataset Creation

This dataset was generated using the distilabel library using teknium's OpenHermes-2.5-Mistral-7B model. The prompts were generated from a seed list of terms and an adapted version of the SELF-INSTRUCT paper's prompting strategy. You can see more details about the process of generating these prompts in the associated dataset davanstrien/haiku_prompts.

From these initial prompts, multiple generations of haiku were generated (again using teknium's OpenHermes-2.5-Mistral-7B model). These generations were then scored using a rule-based system. This rule system scored haikus out of 4, with the following approach to scoring:

If the haiku is not three lines it scores zero. Then for each line, 1 point is deducted if the line does not match the expected syllable count for that line. This means a haiku with three lines matching the traditional 5-7-5 syllable structure will score 4. A haiku with one line with an incorrect syllable count will score 3.

The rule-based system is not perfect and there are some cases where it will incorrectly score a haiku. However, it is relatively easy to understand, and it is relatively easy to score a haiku manually, so it is a good candidate for a rule-based system. The code for this is shared in this GitHub repository.

### Curation Rationale

The dataset was curated with the following goals in mind:
- to explore the process of using open models to generate synthetic datasets
- to explore the use of rules for ranking generations
- to explore how different slices of a DPO dataset impact the resulting model

### Source Data

#### Data Collection and Processing

See above for the process of generating the data.

#### Who are the source data producers?

Almost all of the data is synthetic. The prompts were generated using a seed list of terms and an adapted version of the SELF-INSTRUCT paper's prompting strategy. The generations were generated using teknium's OpenHermes-2.5-Mistral-7B model. The scores were generated using a rule-based system. The initial prompt seed terms were generated by Daniel van Strien with some help from GPT-4.

### Annotations

There are no traditional annotations in this dataset. However, the scores are generated using a rule-based system.
#### Personal and Sensitive Information It is very unlikely that this dataset contains any personal or sensitive information, but if you find any prompts that you believe to be harmful, please open a discussion and I will remove them from the dataset. ## Bias, Risks, and Limitations Whilst I have not found any harmful prompts in the dataset, I have not manually validated all of the prompts. If you find any prompts which you believe to be harmful, please open a discussion and I will remove them from the dataset. ### Recommendations The original seed prompts used to generate this dataset are by no means comprehensive, and the dataset is likely to be biased toward the topics covered by the seed prompts. This dataset will likely develop over time. If you have any suggestions for additional seed prompts, please open a discussion and I will add them to the dataset. [optional] I have zero expectation that this dataset will be cited, but if you do use it in your work, you can cite it as follows: BibTeX: ## Glossary - DPO/Direct Preference Optimization: Introduced in *Direct Preference Optimization: Your Language Model is Secretly a Reward Model* - SELF-INSTRUCT: A prompting strategy introduced in *Self-Instruct: Aligning Language Model with Self Generated Instructions* ## Dataset Card Authors davanstrien ## Dataset Card Contact davanstrien
[ "# Dataset Card for Haiku DPO\n<img src=\"URL alt=\"Built with Distilabel\" width=\"200\" height=\"32\"/>\n\n\nThis a synthetic dataset of haikus. The dataset is constructed with the goal of helping to train LLMs to be more 'technically' competent at writing haikus.", "## Dataset Details\n\n The data consists of a few different components that are described in more detail below but the key components are:\n- a column of synthetically generated user prompts requesting a haiku\n- a column consisting of multiple responses to this prompt, generated by a language model\n- a column consisting of scores for each of these responses, generated by a rule-based system\n\nThe goal of this dataset was to help the author explore the process of synthesizing a dataset for DPO and to explore the extent to which DPO can be used to capture aesthetic preferences in language generation. \n\nHaiku also has the nice property of being relatively easy to score on a 'technical basis' i.e. do they follow the 5-7-5 syllable structure? As a result of this property, some relatively simple Python functions can be used to rate the technical quality of a haiku. \n\nBy focusing on a narrower task, this dataset also intends to offer a place to explore questions such as:\n- should DPO datasets prioritize a large gap in scores between the 'best' and 'worst' generations?\n- Is more data better or is a bigger gap in scores better?\n\nI am also interested in exploring the extent to which smaller models can learn to perform well at a narrower task. Again, haiku writing here is a good candidate for this exploration as it is relatively narrow, the data is cheaper to generate and it is relatively easy to score on a technical basis so you don't need to rely on human annotation or a \"judge\" LM to score the generations.", "### Dataset Description\n\n- Curated by: Daniel van Strien\n- Language(s) (NLP): English (synthetically generated)\n- License: Creative Commons Attribution 4.0 International License", "## Uses\n\nThis dataset can be used \"as is\" to help train LLMs to be more 'technically' competent at writing haikus. However, it is also intended as a \"test bed\" for exploring how different DPO qualities of a DPO dataset impact models trained on these datasets.", "### Direct Use\n\nThe 'default' config can be used for training DPO models. The \"chosen\" and \"rejected\" columns contain the highest-quality and lowest-quality generations respectively. You may, however, want to filter the dataset in other ways to explore how different qualities of a DPO dataset impact the resulting model.", "### Out-of-Scope Use\n\nThis dataset was constructed with a rather narrow goal in mind. It is unlikely to be useful for other tasks. However, it may be useful as a test bed for exploring how different qualities of a DPO dataset impact the resulting model.", "## Dataset Structure\n\nThe dataset consists of a few different configurations:\n\n- 'default': this is likely to be the most useful one for most users. It contains the highest-quality and lowest-quality generations in the \"chosen\" and \"rejected\" columns respectively. It also contains the \"difference_in_score\" column which is the difference between the score of the highest-quality generation and the lowest-quality generation. 
This column can be used to filter the dataset to explore how different qualities of a DPO dataset impact the resulting model.\n\nThe 'default' configuration has the following columns:\n- 'question': the prompt requesting a haiku\n- 'generation_model': the name of the model used to generate the haiku\n- 'generation_prompt': the full prompt used to generate the haiku\n- 'generations': the haikus generated by the model\n- 'scores': the scores for each of the haikus\n- 'chosen': the highest-quality haiku\n- 'chosen_score': the score for the highest-quality haiku\n- 'rejected': the lowest-quality haiku\n- 'rejected_score': the score for the lowest-quality haiku\n- 'tie': whether the highest-quality and lowest-quality haikus have the same score\n- 'difference_in_score': the difference between the score of the highest-quality generation and the lowest-quality generation\n- 'system': the system prompt used during generation\n\nThe 'default' configuration removes ties and ensures the lowest quality generation has a score < below 3. More information on the scoring process is outlined below.\n\nThe 'rule_ranked' configuration is similar to the 'default' configuration but it has not been filtered at all so will give you more scope for things like including ties in your dataset.", "## Dataset Creation\n\nThis dataset was generated using the distilabel library using teknium's OpenHermes-2.5-Mistral-7B model. The prompts were generated from a seed list of terms and an adapted version of the SELF-INSTRUCT papers prompting strategy. You can see more details about the process of generating these prompts in the associated dataset davanstrien/haiku_prompts. \n\nFrom these initial prompts, multiple generations of haiku were generated (again using teknium's OpenHermes-2.5-Mistral-7B model). These generations were then scored using a rule-based system. This rule system scored haikus out of a 4, with the following approach to scoring:\n\nIf the haiku is not three lines it scores zero. Then for each line, 1 point is deducted if the line does not match the expected syllable count for that line. This means a haiku with three lines matching the traditional 5-7-5 syllable structure will score 4. A haiku with one line with an incorrect syllable count will score 3. \n\nThe rule-based system is not perfect and there are some cases where it will incorrectly score a haiku. However, it is relatively easy to understand and it is relatively easy to score a haiku manually so it is a good candidate for a rule-based system. The code for this is shared\nin this GitHub repository.", "### Curation Rationale\n\nThe dataset was curated with the following goals in mind:\n- to explore the process of using open models to generate synthetic datasets\n- to explore the use of rules for ranking generations\n- to explore how different slices of a DPO dataset impact the resulting model", "### Source Data", "#### Data Collection and Processing\n\nSee above for the process of generating the data.", "#### Who are the source data producers?\n\nAlmost all of the data is synthetic. The prompts were generated using a seed list of terms and an adapted version of the SELF-INSTRUCT papers prompting strategy. The generations were generated using teknium's OpenHermes-2.5-Mistral-7B model. The scores were generated using a rule-based system. The initial prompt seed terms were generated by Daniel van Strien with some help from GPT-4.", "### Annotations \n\nThere are no traditional annotations in this dataset. 
However, the scores are generated using a rule-based system.", "#### Personal and Sensitive Information\n\nIt is very unlikely that this dataset contains any personal or sensitive information, but if you find any prompts that you believe to be harmful, please open a discussion and I will remove them from the dataset.", "## Bias, Risks, and Limitations\n\n\n\nWhilst I have not found any harmful prompts in the dataset, I have not manually validated all of the prompts. If you find any prompts which you believe to be harmful, please open a discussion and I will remove them from the dataset.", "### Recommendations\n\n\n\nThe original seed prompts used to generate this dataset are by no means comprehensive, and the dataset is likely to be biased toward the topics covered by the seed prompts. This dataset will likely develop over time. If you have any suggestions for additional seed prompts, please open a discussion and I will add them to the dataset.\n\n[optional]\n\nI have zero expectation that this dataset will be cited, but if you do use it in your work, you can cite it as follows:\n\nBibTeX:", "## Glossary \n\n\n\n- DPO/Direct Preference Optimization: Introduced in *Direct Preference Optimization: Your Language Model is Secretly a Reward Model*\n- SELF-INSTRUCT: A prompting strategy introduced in *Self-Instruct: Aligning Language Model with Self Generated Instructions*", "## Dataset Card Authors\n\ndavanstrien", "## Dataset Card Contact\n\ndavanstrien" ]
[ "TAGS\n#task_categories-text-generation #task_categories-reinforcement-learning #task_categories-conversational #size_categories-1K<n<10K #license-cc-by-4.0 #dpo #poetry #synthetic #arxiv-2212.10560 #arxiv-2110.00482 #arxiv-2305.18290 #region-us \n", "# Dataset Card for Haiku DPO\n<img src=\"URL alt=\"Built with Distilabel\" width=\"200\" height=\"32\"/>\n\n\nThis a synthetic dataset of haikus. The dataset is constructed with the goal of helping to train LLMs to be more 'technically' competent at writing haikus.", "## Dataset Details\n\n The data consists of a few different components that are described in more detail below but the key components are:\n- a column of synthetically generated user prompts requesting a haiku\n- a column consisting of multiple responses to this prompt, generated by a language model\n- a column consisting of scores for each of these responses, generated by a rule-based system\n\nThe goal of this dataset was to help the author explore the process of synthesizing a dataset for DPO and to explore the extent to which DPO can be used to capture aesthetic preferences in language generation. \n\nHaiku also has the nice property of being relatively easy to score on a 'technical basis' i.e. do they follow the 5-7-5 syllable structure? As a result of this property, some relatively simple Python functions can be used to rate the technical quality of a haiku. \n\nBy focusing on a narrower task, this dataset also intends to offer a place to explore questions such as:\n- should DPO datasets prioritize a large gap in scores between the 'best' and 'worst' generations?\n- Is more data better or is a bigger gap in scores better?\n\nI am also interested in exploring the extent to which smaller models can learn to perform well at a narrower task. Again, haiku writing here is a good candidate for this exploration as it is relatively narrow, the data is cheaper to generate and it is relatively easy to score on a technical basis so you don't need to rely on human annotation or a \"judge\" LM to score the generations.", "### Dataset Description\n\n- Curated by: Daniel van Strien\n- Language(s) (NLP): English (synthetically generated)\n- License: Creative Commons Attribution 4.0 International License", "## Uses\n\nThis dataset can be used \"as is\" to help train LLMs to be more 'technically' competent at writing haikus. However, it is also intended as a \"test bed\" for exploring how different DPO qualities of a DPO dataset impact models trained on these datasets.", "### Direct Use\n\nThe 'default' config can be used for training DPO models. The \"chosen\" and \"rejected\" columns contain the highest-quality and lowest-quality generations respectively. You may, however, want to filter the dataset in other ways to explore how different qualities of a DPO dataset impact the resulting model.", "### Out-of-Scope Use\n\nThis dataset was constructed with a rather narrow goal in mind. It is unlikely to be useful for other tasks. However, it may be useful as a test bed for exploring how different qualities of a DPO dataset impact the resulting model.", "## Dataset Structure\n\nThe dataset consists of a few different configurations:\n\n- 'default': this is likely to be the most useful one for most users. It contains the highest-quality and lowest-quality generations in the \"chosen\" and \"rejected\" columns respectively. It also contains the \"difference_in_score\" column which is the difference between the score of the highest-quality generation and the lowest-quality generation. 
This column can be used to filter the dataset to explore how different qualities of a DPO dataset impact the resulting model.\n\nThe 'default' configuration has the following columns:\n- 'question': the prompt requesting a haiku\n- 'generation_model': the name of the model used to generate the haiku\n- 'generation_prompt': the full prompt used to generate the haiku\n- 'generations': the haikus generated by the model\n- 'scores': the scores for each of the haikus\n- 'chosen': the highest-quality haiku\n- 'chosen_score': the score for the highest-quality haiku\n- 'rejected': the lowest-quality haiku\n- 'rejected_score': the score for the lowest-quality haiku\n- 'tie': whether the highest-quality and lowest-quality haikus have the same score\n- 'difference_in_score': the difference between the score of the highest-quality generation and the lowest-quality generation\n- 'system': the system prompt used during generation\n\nThe 'default' configuration removes ties and ensures the lowest-quality generation has a score below 3. More information on the scoring process is outlined below.\n\nThe 'rule_ranked' configuration is similar to the 'default' configuration but it has not been filtered at all, so it will give you more scope for things like including ties in your dataset.", "## Dataset Creation\n\nThis dataset was generated using the distilabel library using teknium's OpenHermes-2.5-Mistral-7B model. The prompts were generated from a seed list of terms and an adapted version of the SELF-INSTRUCT paper's prompting strategy. You can see more details about the process of generating these prompts in the associated dataset davanstrien/haiku_prompts. \n\nFrom these initial prompts, multiple generations of haiku were generated (again using teknium's OpenHermes-2.5-Mistral-7B model). These generations were then scored using a rule-based system. This rule system scored haikus out of 4, with the following approach to scoring:\n\nIf the haiku is not three lines it scores zero. Then for each line, 1 point is deducted if the line does not match the expected syllable count for that line. This means a haiku with three lines matching the traditional 5-7-5 syllable structure will score 4. A haiku with one line with an incorrect syllable count will score 3. \n\nThe rule-based system is not perfect and there are some cases where it will incorrectly score a haiku. However, it is relatively easy to understand and it is relatively easy to score a haiku manually so it is a good candidate for a rule-based system. The code for this is shared\nin this GitHub repository.", "### Curation Rationale\n\nThe dataset was curated with the following goals in mind:\n- to explore the process of using open models to generate synthetic datasets\n- to explore the use of rules for ranking generations\n- to explore how different slices of a DPO dataset impact the resulting model", "### Source Data", "#### Data Collection and Processing\n\nSee above for the process of generating the data.", "#### Who are the source data producers?\n\nAlmost all of the data is synthetic. The prompts were generated using a seed list of terms and an adapted version of the SELF-INSTRUCT paper's prompting strategy. The generations were generated using teknium's OpenHermes-2.5-Mistral-7B model. The scores were generated using a rule-based system. The initial prompt seed terms were generated by Daniel van Strien with some help from GPT-4.", "### Annotations \n\nThere are no traditional annotations in this dataset.
However, the scores are generated using a rule-based system.", "#### Personal and Sensitive Information\n\nIt is very unlikely that this dataset contains any personal or sensitive information, but if you find any prompts that you believe to be harmful, please open a discussion and I will remove them from the dataset.", "## Bias, Risks, and Limitations\n\n\n\nWhilst I have not found any harmful prompts in the dataset, I have not manually validated all of the prompts. If you find any prompts which you believe to be harmful, please open a discussion and I will remove them from the dataset.", "### Recommendations\n\n\n\nThe original seed prompts used to generate this dataset are by no means comprehensive, and the dataset is likely to be biased toward the topics covered by the seed prompts. This dataset will likely develop over time. If you have any suggestions for additional seed prompts, please open a discussion and I will add them to the dataset.\n\n[optional]\n\nI have zero expectation that this dataset will be cited, but if you do use it in your work, you can cite it as follows:\n\nBibTeX:", "## Glossary \n\n\n\n- DPO/Direct Preference Optimization: Introduced in *Direct Preference Optimization: Your Language Model is Secretly a Reward Model*\n- SELF-INSTRUCT: A prompting strategy introduced in *Self-Instruct: Aligning Language Model with Self Generated Instructions*", "## Dataset Card Authors\n\ndavanstrien", "## Dataset Card Contact\n\ndavanstrien" ]
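The 5-7-5 scoring rule described in the Haiku DPO card above is mechanical enough to sketch in a few lines of Python. The snippet below is a minimal illustration only, not the implementation shared in the card's linked GitHub repository; in particular, the vowel-group syllable counter is a crude stand-in assumption for a proper syllable counter.

```python
import re


def count_syllables(word: str) -> int:
    # Crude heuristic (assumption): one syllable per run of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def score_haiku(haiku: str) -> int:
    """Score a haiku out of 4 per the card's rule: 0 if not three lines,
    otherwise deduct 1 point for each line missing its 5-7-5 target."""
    lines = [line.strip() for line in haiku.strip().splitlines() if line.strip()]
    if len(lines) != 3:
        return 0
    score = 4
    for line, expected in zip(lines, (5, 7, 5)):
        if sum(count_syllables(word) for word in line.split()) != expected:
            score -= 1
    return score
```

Under this rule a fully conforming haiku scores 4, and a three-line haiku can never score below 1, which is consistent with the 'default' configuration's filter keeping only pairs whose rejected generation scores below 3.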
6a3f19e0451ab9019a51b677cf557d4aeacb0a11
# Dataset for SemEval-2024 Task 3

The dataset for [SemEval-2024 Task 3: The Competition of Multimodal Emotion Cause Analysis in Conversations](https://nustm.github.io/SemEval-2024_ECAC/) is released here.

## File Description

```
SemEval-2024_Task3
|-- README.md
|-- training_data
| |-- Subtask_1_train.json
| |-- Subtask_2_train.json
| |-- test.tar.gz
│ │ ├── dia1utt1.mp4
│ │ ├── dia1utt2.mp4
│ │ ├── ...
| |-- train.tar.gz
| `-- valid.tar.gz
`-- trial_data
| |-- Subtask_1_trial.json
| |-- Subtask_2_trial.json
| `-- video_trial.zip
│ │ ├── diaasdda.mp4
│ │ ├── digdfgdr.mp4
│ │ ├── ...
```
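The annotation files in the tree above are plain JSON, so a first look at them needs nothing beyond the standard library. The sketch below is illustrative only: the schema of `Subtask_1_train.json` is not documented in this card, so the code discovers the top-level structure instead of assuming field names.

```python
import json

# Assumption: the file sits at the path shown in the tree above.
with open("training_data/Subtask_1_train.json", encoding="utf-8") as f:
    data = json.load(f)

print(type(data).__name__, len(data))
if isinstance(data, list) and data and isinstance(data[0], dict):
    # Peek at the first record's keys to learn the (undocumented) schema.
    print(sorted(data[0].keys()))
```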
dim/SemEvalSubtask2
[ "region:us" ]
2024-01-11T13:38:53+00:00
{}
2024-01-17T19:56:56+00:00
[]
[]
TAGS #region-us
# Dataset for SemEval-2024 Task 3

The dataset for SemEval-2024 Task 3: The Competition of Multimodal Emotion Cause Analysis in Conversations is released here.

## File Description
[ "# Dataset for SemEval-2024 Task 3\n\nThe dataset for SemEval-2024 Task 3: The Competition of Multimodal Emotion Cause Analysis in Conversations is released here.", "## File Description" ]
[ "TAGS\n#region-us \n", "# Dataset for SemEval-2024 Task 3\n\nThe dataset for SemEval-2024 Task 3: The Competition of Multimodal Emotion Cause Analysis in Conversations is released here.", "## File Description" ]
48541bb8dd346bcac760d756fbc7db79e363cdab
# Dataset Card for Evaluation run of NLPinas/yi-bagel-2x34b-moe <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [NLPinas/yi-bagel-2x34b-moe](https://huggingface.co/NLPinas/yi-bagel-2x34b-moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b-moe", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T13:51:44.686657](https://huggingface.co/datasets/open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b-moe/blob/main/results_2024-01-11T13-51-44.686657.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7615183276503468, "acc_stderr": 0.02832471118128543, "acc_norm": 0.7668016554766149, "acc_norm_stderr": 0.028849292688075817, "mc1": 0.5642594859241126, "mc1_stderr": 0.01735834539886313, "mc2": 0.7142422056307771, "mc2_stderr": 0.014238871538897193 }, "harness|arc:challenge|25": { "acc": 0.6962457337883959, "acc_stderr": 0.013438909184778762, "acc_norm": 0.726962457337884, "acc_norm_stderr": 0.013019332762635748 }, "harness|hellaswag|10": { "acc": 0.6620195180242979, "acc_stderr": 0.0047205513235471265, "acc_norm": 0.8544114718183629, "acc_norm_stderr": 0.0035197241633108875 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7481481481481481, "acc_stderr": 0.03749850709174021, "acc_norm": 0.7481481481481481, "acc_norm_stderr": 0.03749850709174021 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.868421052631579, "acc_stderr": 0.027508689533549912, "acc_norm": 0.868421052631579, "acc_norm_stderr": 0.027508689533549912 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.78, "acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8075471698113208, "acc_stderr": 0.024262979839372274, "acc_norm": 0.8075471698113208, "acc_norm_stderr": 0.024262979839372274 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8958333333333334, "acc_stderr": 0.025545239210256917, "acc_norm": 0.8958333333333334, "acc_norm_stderr": 0.025545239210256917 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7456647398843931, "acc_stderr": 0.0332055644308557, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.0332055644308557 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5784313725490197, "acc_stderr": 0.04913595201274503, "acc_norm": 0.5784313725490197, "acc_norm_stderr": 0.04913595201274503 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7702127659574468, "acc_stderr": 0.027501752944412417, "acc_norm": 0.7702127659574468, "acc_norm_stderr": 0.027501752944412417 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5877192982456141, "acc_stderr": 0.04630653203366596, "acc_norm": 0.5877192982456141, "acc_norm_stderr": 0.04630653203366596 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7241379310344828, "acc_stderr": 0.037245636197746304, "acc_norm": 0.7241379310344828, "acc_norm_stderr": 0.037245636197746304 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.716931216931217, "acc_stderr": 0.023201392938194974, "acc_norm": 0.716931216931217, "acc_norm_stderr": 0.023201392938194974 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04360314860077459, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9032258064516129, "acc_stderr": 0.016818943416345197, "acc_norm": 0.9032258064516129, "acc_norm_stderr": 0.016818943416345197 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6403940886699507, "acc_stderr": 0.03376458246509567, "acc_norm": 0.6403940886699507, "acc_norm_stderr": 0.03376458246509567 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8606060606060606, "acc_stderr": 0.027045948825865394, "acc_norm": 0.8606060606060606, "acc_norm_stderr": 0.027045948825865394 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9191919191919192, "acc_stderr": 0.019417681889724536, "acc_norm": 0.9191919191919192, "acc_norm_stderr": 0.019417681889724536 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9689119170984456, "acc_stderr": 0.012525310625527033, "acc_norm": 0.9689119170984456, "acc_norm_stderr": 0.012525310625527033 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8179487179487179, "acc_stderr": 0.0195652367829309, "acc_norm": 0.8179487179487179, "acc_norm_stderr": 0.0195652367829309 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4703703703703704, "acc_stderr": 0.030431963547936584, "acc_norm": 0.4703703703703704, "acc_norm_stderr": 0.030431963547936584 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8235294117647058, "acc_stderr": 0.02476290267805791, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.02476290267805791 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4966887417218543, "acc_stderr": 
0.04082393379449654, "acc_norm": 0.4966887417218543, "acc_norm_stderr": 0.04082393379449654 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9137614678899083, "acc_stderr": 0.012035597300116245, "acc_norm": 0.9137614678899083, "acc_norm_stderr": 0.012035597300116245 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0321495214780275, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0321495214780275 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9166666666666666, "acc_stderr": 0.019398452135813905, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.019398452135813905 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9071729957805907, "acc_stderr": 0.01888975055095671, "acc_norm": 0.9071729957805907, "acc_norm_stderr": 0.01888975055095671 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8026905829596412, "acc_stderr": 0.02670985334496796, "acc_norm": 0.8026905829596412, "acc_norm_stderr": 0.02670985334496796 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8702290076335878, "acc_stderr": 0.029473649496907065, "acc_norm": 0.8702290076335878, "acc_norm_stderr": 0.029473649496907065 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8925619834710744, "acc_stderr": 0.028268812192540637, "acc_norm": 0.8925619834710744, "acc_norm_stderr": 0.028268812192540637 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8981481481481481, "acc_stderr": 0.02923927267563275, "acc_norm": 0.8981481481481481, "acc_norm_stderr": 0.02923927267563275 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8650306748466258, "acc_stderr": 0.026845765054553838, "acc_norm": 0.8650306748466258, "acc_norm_stderr": 0.026845765054553838 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8737864077669902, "acc_stderr": 0.03288180278808628, "acc_norm": 0.8737864077669902, "acc_norm_stderr": 0.03288180278808628 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9444444444444444, "acc_stderr": 0.015006312806446912, "acc_norm": 0.9444444444444444, "acc_norm_stderr": 0.015006312806446912 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.9, "acc_stderr": 0.03015113445777634, "acc_norm": 0.9, "acc_norm_stderr": 0.03015113445777634 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9016602809706258, "acc_stderr": 0.010648356301876338, "acc_norm": 0.9016602809706258, "acc_norm_stderr": 0.010648356301876338 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.815028901734104, "acc_stderr": 0.02090397584208303, "acc_norm": 0.815028901734104, "acc_norm_stderr": 0.02090397584208303 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.7988826815642458, "acc_stderr": 0.013405946402609049, "acc_norm": 0.7988826815642458, "acc_norm_stderr": 0.013405946402609049 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8529411764705882, "acc_stderr": 0.020279402936174588, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.020279402936174588 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8135048231511254, "acc_stderr": 0.022122439772480768, "acc_norm": 0.8135048231511254, "acc_norm_stderr": 0.022122439772480768 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8672839506172839, "acc_stderr": 0.018877353839571842, "acc_norm": 0.8672839506172839, "acc_norm_stderr": 0.018877353839571842 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.624113475177305, "acc_stderr": 0.028893955412115875, "acc_norm": 0.624113475177305, "acc_norm_stderr": 0.028893955412115875 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5808344198174706, "acc_stderr": 0.012602244505788224, "acc_norm": 0.5808344198174706, "acc_norm_stderr": 0.012602244505788224 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8308823529411765, "acc_stderr": 0.022770868010113018, "acc_norm": 0.8308823529411765, "acc_norm_stderr": 0.022770868010113018 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8120915032679739, "acc_stderr": 0.015803565736776694, "acc_norm": 0.8120915032679739, "acc_norm_stderr": 0.015803565736776694 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7363636363636363, "acc_stderr": 0.04220224692971987, "acc_norm": 0.7363636363636363, "acc_norm_stderr": 0.04220224692971987 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8285714285714286, "acc_stderr": 0.02412746346265015, "acc_norm": 0.8285714285714286, "acc_norm_stderr": 0.02412746346265015 }, "harness|hendrycksTest-sociology|5": { "acc": 0.900497512437811, "acc_stderr": 0.021166216304659393, "acc_norm": 0.900497512437811, "acc_norm_stderr": 0.021166216304659393 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598053, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598053 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8888888888888888, "acc_stderr": 0.02410338420207286, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.02410338420207286 }, "harness|truthfulqa:mc|0": { "mc1": 0.5642594859241126, "mc1_stderr": 0.01735834539886313, "mc2": 0.7142422056307771, "mc2_stderr": 0.014238871538897193 }, "harness|winogrande|5": { "acc": 0.8271507498026835, "acc_stderr": 0.010626964529971868 }, "harness|gsm8k|5": { "acc": 0.6072782410917361, "acc_stderr": 0.013451745349586576 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
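As a follow-up to the loading example in the card above, the aggregated numbers can be pulled from the "results" configuration it mentions. A minimal sketch, assuming that configuration follows the same split convention as the task configs (timestamped splits, with "train" pointing at the latest run):

```python
from datasets import load_dataset

# Assumption: the "results" config uses the same split naming as the
# per-task configs, so "train" resolves to the latest aggregated run.
results = load_dataset(
    "open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b-moe",
    "results",
    split="train",
)
print(results)
```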
open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b-moe
[ "region:us" ]
2024-01-11T13:53:56+00:00
{"pretty_name": "Evaluation run of NLPinas/yi-bagel-2x34b-moe", "dataset_summary": "Dataset automatically created during the evaluation run of model [NLPinas/yi-bagel-2x34b-moe](https://huggingface.co/NLPinas/yi-bagel-2x34b-moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b-moe\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T13:51:44.686657](https://huggingface.co/datasets/open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b-moe/blob/main/results_2024-01-11T13-51-44.686657.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7615183276503468,\n \"acc_stderr\": 0.02832471118128543,\n \"acc_norm\": 0.7668016554766149,\n \"acc_norm_stderr\": 0.028849292688075817,\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7142422056307771,\n \"mc2_stderr\": 0.014238871538897193\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6962457337883959,\n \"acc_stderr\": 0.013438909184778762,\n \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635748\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6620195180242979,\n \"acc_stderr\": 0.0047205513235471265,\n \"acc_norm\": 0.8544114718183629,\n \"acc_norm_stderr\": 0.0035197241633108875\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549912,\n \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549912\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.04913595201274503,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.04913595201274503\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7702127659574468,\n \"acc_stderr\": 0.027501752944412417,\n \"acc_norm\": 0.7702127659574468,\n \"acc_norm_stderr\": 0.027501752944412417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.037245636197746304,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.037245636197746304\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.716931216931217,\n \"acc_stderr\": 0.023201392938194974,\n \"acc_norm\": 0.716931216931217,\n \"acc_norm_stderr\": 0.023201392938194974\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.0195652367829309,\n \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.0195652367829309\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02476290267805791,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02476290267805791\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553838,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553838\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9016602809706258,\n \"acc_stderr\": 0.010648356301876338,\n \"acc_norm\": 0.9016602809706258,\n \"acc_norm_stderr\": 0.010648356301876338\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.02090397584208303,\n \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.02090397584208303\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7988826815642458,\n \"acc_stderr\": 0.013405946402609049,\n \"acc_norm\": 0.7988826815642458,\n \"acc_norm_stderr\": 0.013405946402609049\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.020279402936174588,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.020279402936174588\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n \"acc_stderr\": 0.022122439772480768,\n \"acc_norm\": 0.8135048231511254,\n \"acc_norm_stderr\": 0.022122439772480768\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571842,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.624113475177305,\n \"acc_stderr\": 0.028893955412115875,\n \"acc_norm\": 0.624113475177305,\n \"acc_norm_stderr\": 0.028893955412115875\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5808344198174706,\n \"acc_stderr\": 0.012602244505788224,\n \"acc_norm\": 0.5808344198174706,\n \"acc_norm_stderr\": 0.012602244505788224\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010113018,\n \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010113018\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8120915032679739,\n \"acc_stderr\": 0.015803565736776694,\n \"acc_norm\": 0.8120915032679739,\n \"acc_norm_stderr\": 0.015803565736776694\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.02412746346265015,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.02412746346265015\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02410338420207286,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02410338420207286\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7142422056307771,\n \"mc2_stderr\": 0.014238871538897193\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971868\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6072782410917361,\n \"acc_stderr\": 0.013451745349586576\n 
}\n}\n```", "repo_url": "https://huggingface.co/NLPinas/yi-bagel-2x34b-moe", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|arc:challenge|25_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|gsm8k|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hellaswag|10_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T13-51-44.686657.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T13-51-44.686657.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T13-51-44.686657.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T13-51-44.686657.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T13-51-44.686657.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T13_51_44.686657", "path": ["**/details_harness|winogrande|5_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T13-51-44.686657.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_11T13_51_44.686657", "path": ["results_2024-01-11T13-51-44.686657.parquet"]}, {"split": "latest", "path": ["results_2024-01-11T13-51-44.686657.parquet"]}]}]}
2024-01-11T13:54:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of NLPinas/yi-bagel-2x34b-moe Dataset automatically created during the evaluation run of model NLPinas/yi-bagel-2x34b-moe on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T13:51:44.686657 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
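The loading snippet was stripped from this text rendering along with the other code blocks; judging from the identical leaderboard cards elsewhere in this dump (see the AA051611/whattest record below), it presumably looked like the sketch that follows. The repository name is inferred from the leaderboard's `details_<org>__<model>` naming pattern rather than quoted from this record, and `harness_winogrande_5` is one of the config names listed in the metadata above.

```python
from datasets import load_dataset

# Sketch of the stripped example. The repo name follows the leaderboard's
# "details_<org>__<model>" pattern and is inferred, not quoted from this record;
# "harness_winogrande_5" is one of the configs declared in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b-moe",
    "harness_winogrande_5",
    split="train",
)
```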
[ "# Dataset Card for Evaluation run of NLPinas/yi-bagel-2x34b-moe\n\n\n\nDataset automatically created during the evaluation run of model NLPinas/yi-bagel-2x34b-moe on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T13:51:44.686657(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of NLPinas/yi-bagel-2x34b-moe\n\n\n\nDataset automatically created during the evaluation run of model NLPinas/yi-bagel-2x34b-moe on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T13:51:44.686657(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f6ab437a5d93c576e9f7451de2bd8c16d2a91c2d
# Instruct-Aira Dataset version 3.0 ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Repository:** https://github.com/Nkluge-correa/Aira - **Point of Contact:** [AIRES at PUCRS]([email protected]) ### Dataset Summary This dataset contains a collection of multi-turn conversations between an assistant and a user. Conversations were generated by user interactions with already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc). The dataset is available in Portuguese and English. ### Supported Tasks and Leaderboards This dataset can be utilized for various natural language processing tasks, including but not limited to: - Language modeling. - Question-answering systems. - Chatbot development. - Evaluation of language models. - Alignment research. ### Languages English and Portuguese. ## Dataset Structure ### Data Instances The dataset consists of the following features: - **Conversation ID:** Identifier of the conversation. - **Conversations:** A list of dictionaries following a [chat format](https://github.com/huggingface/blog/blob/main/chat-templates.md). ### Data Fields ```python [ {'role': 'user', 'content': 'Hello! What is your name?'}, {'role': 'assistant', 'content': 'Hello! My name is Aira. How can I help you?'}, {'role': 'user', 'content': 'What is a language model, Aira?'}, {'role': 'assistant', 'content': 'A language model is a probability distribution over a vocabulary.'}, ] ``` ### Data Splits Available splits are `english` and `portuguese`. ```python from datasets import load_dataset dataset = load_dataset("nicholasKluge/instruct-aira-dataset-v3", split='portuguese') ``` ## Dataset Creation ### Curation Rationale This dataset was developed as part of [Nicholas Kluge's](https://nkluge-correa.github.io/) doctoral dissertation, "_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._" This research was funded by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn. ### Source Data #### Initial Data Collection and Normalization All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets. #### Who are the source language producers? 
All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets. ### Annotations #### Annotation process All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets. #### Who are the annotators? No annotators were used. ### Personal and Sensitive Information No personal or sensitive information is part of this dataset. ## Considerations for Using the Data ### Social Impact of Dataset No considerations. ### Discussion of Biases No considerations. ### Other Known Limitations No considerations. ## Additional Information ### Dataset Curators [Nicholas Kluge Corrêa](mailto:[email protected]). ### Licensing Information This dataset is licensed under the [Apache License, version 2.0](LICENSE). ### Citation Information ```latex @misc{nicholas22aira, doi = {10.5281/zenodo.6989727}, url = {https://github.com/Nkluge-correa/Aira}, author = {Nicholas Kluge Corrêa}, title = {Aira}, year = {2023}, publisher = {GitHub}, journal = {GitHub repository}, } ``` ### Contributions If you would like to contribute, contact me at [[email protected]](mailto:[email protected])!
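Since each row's `conversations` field is a list of role/content dictionaries, a natural way to consume it is through a tokenizer's chat template. The sketch below is illustrative only: the checkpoint name is a placeholder for any chat-template-aware tokenizer, while the dataset id, split names, and the `conversations` feature come from the card above.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

dataset = load_dataset("nicholasKluge/instruct-aira-dataset-v3", split="english")

# Placeholder checkpoint: substitute any tokenizer that ships a chat template.
tokenizer = AutoTokenizer.from_pretrained("some-org/some-chat-model")

# Each "conversations" entry is already in the role/content format that
# apply_chat_template expects, so it can be rendered into a single string.
text = tokenizer.apply_chat_template(dataset[0]["conversations"], tokenize=False)
print(text)
```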
nicholasKluge/instruct-aira-dataset-v3
[ "task_categories:conversational", "task_categories:text-generation", "size_categories:10K<n<100K", "language:pt", "language:en", "license:apache-2.0", "alignment", "instruction", "chat", "region:us" ]
2024-01-11T14:08:33+00:00
{"language": ["pt", "en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation"], "pretty_name": "Instruct-Aira Dataset version 3.0", "tags": ["alignment", "instruction", "chat"], "dataset_info": {"features": [{"name": "conversation_id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "portuguese", "num_bytes": 348823623, "num_examples": 50000}, {"name": "english", "num_bytes": 317852173, "num_examples": 50000}], "download_size": 330840060, "dataset_size": 666675796}, "configs": [{"config_name": "default", "data_files": [{"split": "portuguese", "path": "data/portuguese-*"}, {"split": "english", "path": "data/english-*"}]}]}
2024-02-15T18:13:13+00:00
[]
[ "pt", "en" ]
TAGS #task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Portuguese #language-English #license-apache-2.0 #alignment #instruction #chat #region-us
# Instruct-Aira Dataset version 3.0 ## Table of Contents - Table of Contents - Dataset Description - Dataset Summary - Supported Tasks and Leaderboards - Languages - Dataset Structure - Data Instances - Data Fields - Data Splits - Dataset Creation - Curation Rationale - Source Data - Annotations - Personal and Sensitive Information - Considerations for Using the Data - Social Impact of Dataset - Discussion of Biases - Other Known Limitations - Additional Information - Dataset Curators - Licensing Information - Citation Information - Contributions ## Dataset Description - Repository: URL - Point of Contact: AIRES at PUCRS ### Dataset Summary This dataset contains a collection of multi-turn conversations between an assistant and a user. Conversations were generated by user interactions with already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc). The dataset is available in Portuguese and English. ### Supported Tasks and Leaderboards This dataset can be utilized for various natural language processing tasks, including but not limited to: - Language modeling. - Question-answering systems. - Chatbot development. - Evaluation of language models. - Alignment research. ### Languages English and Portuguese. ## Dataset Structure ### Data Instances The dataset consists of the following features: - Conversation ID: Identifier of the conversation. - Conversations: A list of dictionaries following a chat format. ### Data Fields ### Data Splits Available splits are 'english' and 'portuguese'. ## Dataset Creation ### Curation Rationale This dataset was developed as part of Nicholas Kluge's doctoral dissertation, "_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._" This research was funded by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn. ### Source Data #### Initial Data Collection and Normalization All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets. #### Who are the source language producers? All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets. ### Annotations #### Annotation process All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets. #### Who are the annotators? No annotators were used. ### Personal and Sensitive Information No personal or sensitive information is part of this dataset. ## Considerations for Using the Data ### Social Impact of Dataset No considerations. ### Discussion of Biases No considerations. ### Other Known Limitations No considerations. ## Additional Information ### Dataset Curators Nicholas Kluge Corrêa. ### Licensing Information This dataset is licensed under the Apache License, version 2.0. ### Contributions If you would like to contribute, contact me at nicholas@URL!
[ "# Instruct-Aira Dataset version 3.0", "## Table of Contents\n\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions", "## Dataset Description\n\n- Repository: URL\n- Point of Contact: AIRES at PUCRS", "### Dataset Summary\n\nThis dataset contains a collection of multi-turn conversations between an assistant and a user. Conversations were generated by user interactions with already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc). The dataset is available in Portuguese and English.", "### Supported Tasks and Leaderboards\n\nThis dataset can be utilized for various natural language processing tasks, including but not limited to:\n\n- Language modeling.\n- Question-answering systems.\n- Chatbot development.\n- Evaluation of language models.\n- Alignment research.", "### Languages\n\nEnglish and Portuguese.", "## Dataset Structure", "### Data Instances\n\nThe dataset consists of the following features:\n\n- Conversation ID: Identifier of the conversation.\n- Conversations: A list of dictionaries following a chat format.", "### Data Fields", "### Data Splits\n\nAvailable splits are 'english' and 'portuguese'.", "## Dataset Creation", "### Curation Rationale\n\nThis dataset was developed are part of Nicholas Kluge's doctoral dissertation, \"_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._\" This research was funded by CNPq (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn.", "### Source Data", "#### Initial Data Collection and Normalization\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.", "#### Who are the source language producers?\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.", "### Annotations", "#### Annotation process\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.", "#### Who are the annotators?\n\nNo annotators were used.", "### Personal and Sensitive Information\n\nNo personal or sensitive information is part of this dataset.", "## Considerations for Using the Data", "### Social Impact of Dataset\n\nNo considerations.", "### Discussion of Biases\n\nNo considerations.", "### Other Known Limitations\n\nNo considerations.", "## Additional Information", "### Dataset Curators\n\nNicholas Kluge Corrêa.", "### Licensing Information\n\nThis dataset is licensed under the Apache License, version 2.0.", "### Contributions\n\nIf you would like to contribute, contact me at nicholas@URL!" ]
[ "TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Portuguese #language-English #license-apache-2.0 #alignment #instruction #chat #region-us \n", "# Instruct-Aira Dataset version 3.0", "## Table of Contents\n\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions", "## Dataset Description\n\n- Repository: URL\n- Point of Contact: AIRES at PUCRS", "### Dataset Summary\n\nThis dataset contains a collection of multi-turn conversations between an assistant and a user. Conversations were generated by user interactions with already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc). The dataset is available in Portuguese and English.", "### Supported Tasks and Leaderboards\n\nThis dataset can be utilized for various natural language processing tasks, including but not limited to:\n\n- Language modeling.\n- Question-answering systems.\n- Chatbot development.\n- Evaluation of language models.\n- Alignment research.", "### Languages\n\nEnglish and Portuguese.", "## Dataset Structure", "### Data Instances\n\nThe dataset consists of the following features:\n\n- Conversation ID: Identifier of the conversation.\n- Conversations: A list of dictionaries following a chat format.", "### Data Fields", "### Data Splits\n\nAvailable splits are 'english' and 'portuguese'.", "## Dataset Creation", "### Curation Rationale\n\nThis dataset was developed are part of Nicholas Kluge's doctoral dissertation, \"_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._\" This research was funded by CNPq (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn.", "### Source Data", "#### Initial Data Collection and Normalization\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.", "#### Who are the source language producers?\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.", "### Annotations", "#### Annotation process\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). 
Prompts were gathered from publicly available datasets.", "#### Who are the annotators?\n\nNo annotators were used.", "### Personal and Sensitive Information\n\nNo personal or sensitive information is part of this dataset.", "## Considerations for Using the Data", "### Social Impact of Dataset\n\nNo considerations.", "### Discussion of Biases\n\nNo considerations.", "### Other Known Limitations\n\nNo considerations.", "## Additional Information", "### Dataset Curators\n\nNicholas Kluge Corrêa.", "### Licensing Information\n\nThis dataset is licensed under the Apache License, version 2.0.", "### Contributions\n\nIf you would like to contribute, contact me at nicholas@URL!" ]
f706d2c746990012a863d8b49dc02c0aa1fcfb12
# Dataset Card for "Diffusion-RFdiffusion_knots" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EvaKlimentova/Diffusion-RFdiffusion_knots
[ "region:us" ]
2024-01-11T14:13:58+00:00
{"dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "Sequence", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1383863.6077145613, "num_examples": 4666}, {"name": "test", "num_bytes": 153927.39228543878, "num_examples": 519}], "download_size": 1312156, "dataset_size": 1537791.0}}
2024-01-16T07:23:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Diffusion-RFdiffusion_knots" More Information needed
[ "# Dataset Card for \"Diffusion-RFdiffusion_knots\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Diffusion-RFdiffusion_knots\"\n\nMore Information needed" ]
db1b69528ae533f8ced5d644db95e18024dd7e99
# Dataset Card for "oasst1-chatml" This dataset is ChatML template added version of Open Assistant dataset, which you can find here: https://huggingface.co/datasets/OpenAssistant/oasst1/tree/main This dataset contains conversation tree of oasst1 dataset with ChatML format, with a total of 9,846 samples. For further information, please see the original dataset.
RaviNaik/oasst1-chatml
[ "task_categories:text-generation", "size_categories:1K<n<10K", "license:mit", "region:us" ]
2024-01-11T14:33:07+00:00
{"license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "pretty_name": "OASST1-ChatML", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 48375961, "num_examples": 9846}], "download_size": 25030536, "dataset_size": 48375961}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-11T14:52:40+00:00
[]
[]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #license-mit #region-us
# Dataset Card for "oasst1-chatml" This dataset is ChatML template added version of Open Assistant dataset, which you can find here: URL This dataset contains conversation tree of oasst1 dataset with ChatML format, with a total of 9,846 samples. For further information, please see the original dataset.
[ "# Dataset Card for \"oasst1-chatml\"\n\nThis dataset is ChatML template added version of Open Assistant dataset, which you can find here: URL\n\nThis dataset contains conversation tree of oasst1 dataset with ChatML format, with a total of 9,846 samples.\n\nFor further information, please see the original dataset." ]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #license-mit #region-us \n", "# Dataset Card for \"oasst1-chatml\"\n\nThis dataset is ChatML template added version of Open Assistant dataset, which you can find here: URL\n\nThis dataset contains conversation tree of oasst1 dataset with ChatML format, with a total of 9,846 samples.\n\nFor further information, please see the original dataset." ]
54fb62ed500e134b0302b03f7b2b0c878ee6e2d6
# Medical-Calgary-Cambridge-chat-37 37 chat/dialogue entries between a patient and doctor, using the Calgary-Cambridge model. Originally made for [`microsoft/phi-2`](https://huggingface.co/microsoft/phi-2) chat format.
kazcfz/Medical-Calgary-Cambridge-chat-37
[ "region:us" ]
2024-01-11T14:33:36+00:00
{}
2024-01-11T14:35:46+00:00
[]
[]
TAGS #region-us
# Medical-Calgary-Cambridge-chat-37 37 chat/dialogue entries between a patient and doctor, using the Calgary-Cambridge model. Originally made for 'microsoft/phi-2' chat format.
[ "# Medical-Calgary-Cambridge-chat-37\n37 chat/dialogue entries between a patient and doctor, using the Calgary-Cambridge model.\nOriginally made for 'microsoft/phi-2' chat format." ]
[ "TAGS\n#region-us \n", "# Medical-Calgary-Cambridge-chat-37\n37 chat/dialogue entries between a patient and doctor, using the Calgary-Cambridge model.\nOriginally made for 'microsoft/phi-2' chat format." ]
e0773ff28df5493570001d9bf1f2483948b6b79f
# Dataset Card for "dialog_data_train_hf" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Coooori/dialog_data_train_hf
[ "region:us" ]
2024-01-11T14:38:07+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1375302, "num_examples": 801}], "download_size": 729647, "dataset_size": 1375302}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-11T14:38:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for "dialog_data_train_hf" More Information needed
[ "# Dataset Card for \"dialog_data_train_hf\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"dialog_data_train_hf\"\n\nMore Information needed" ]
467c05588941b97c0334b193fdea91340c0eee09
# Dataset Card for "dialog_data_dev_hf" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Coooori/dialog_data_dev_hf
[ "region:us" ]
2024-01-11T14:38:14+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 166788, "num_examples": 99}], "download_size": 0, "dataset_size": 166788}}
2024-01-11T14:38:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for "dialog_data_dev_hf" More Information needed
[ "# Dataset Card for \"dialog_data_dev_hf\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"dialog_data_dev_hf\"\n\nMore Information needed" ]
4f4723fe7c388bfd2450f83ab8f76b399635f303
# Dataset Card for "dialog_data_test_hf" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Coooori/dialog_data_test_hf
[ "region:us" ]
2024-01-11T14:38:36+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 165142, "num_examples": 99}], "download_size": 95370, "dataset_size": 165142}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-11T14:38:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "dialog_data_test_hf" More Information needed
[ "# Dataset Card for \"dialog_data_test_hf\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"dialog_data_test_hf\"\n\nMore Information needed" ]
0a262cb6f049386a014b34611d7a173e3e9b4237
# Dataset Card for "kmmlu_90" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
amphora/kmmlu_90
[ "region:us" ]
2024-01-11T14:48:16+00:00
{"configs": [{"config_name": "cot", "data_files": [{"split": "test", "path": "data/kmmlu-90-cot.csv"}]}, {"config_name": "direct", "data_files": [{"split": "test", "path": "data/kmmlu-90-direct.csv"}]}, {"config_name": "plain", "data_files": [{"split": "test", "path": "data/kmmlu90_plain.csv"}]}, {"config_name": "encot", "data_files": [{"split": "test", "path": "data/kmmlu90-encot.csv"}]}, {"config_name": "encot3", "data_files": [{"split": "test", "path": "data/kmmlu90-encot-3.csv"}]}]}
2024-01-17T12:28:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for "kmmlu_90" More Information needed
[ "# Dataset Card for \"kmmlu_90\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"kmmlu_90\"\n\nMore Information needed" ]
cf5197b239cf940097362a21748aad2aaec758ea
# Dataset Card for Evaluation run of AA051611/whattest <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [AA051611/whattest](https://huggingface.co/AA051611/whattest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_AA051611__whattest", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-11T14:53:31.657383](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__whattest/blob/main/results_2024-01-11T14-53-31.657383.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7616138245668181, "acc_stderr": 0.028047923748497502, "acc_norm": 0.7655989871774836, "acc_norm_stderr": 0.02857778553593116, "mc1": 0.4149326805385557, "mc1_stderr": 0.017248314465805978, "mc2": 0.5803674233221108, "mc2_stderr": 0.014839457098843786 }, "harness|arc:challenge|25": { "acc": 0.6348122866894198, "acc_stderr": 0.014070265519268804, "acc_norm": 0.6680887372013652, "acc_norm_stderr": 0.013760988200880534 }, "harness|hellaswag|10": { "acc": 0.6463851822346146, "acc_stderr": 0.0047711430744261304, "acc_norm": 0.8442541326428998, "acc_norm_stderr": 0.003618731658837713 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7037037037037037, "acc_stderr": 0.03944624162501116, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.03944624162501116 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8947368421052632, "acc_stderr": 0.024974533450920697, "acc_norm": 0.8947368421052632, "acc_norm_stderr": 0.024974533450920697 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.81, "acc_stderr": 0.03942772444036622, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036622 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8415094339622642, "acc_stderr": 0.02247652871016771, "acc_norm": 0.8415094339622642, "acc_norm_stderr": 0.02247652871016771 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9027777777777778, "acc_stderr": 0.024774516250440175, "acc_norm": 0.9027777777777778, "acc_norm_stderr": 0.024774516250440175 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7456647398843931, "acc_stderr": 0.0332055644308557, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.0332055644308557 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5784313725490197, "acc_stderr": 0.049135952012745024, "acc_norm": 0.5784313725490197, "acc_norm_stderr": 0.049135952012745024 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7829787234042553, "acc_stderr": 0.02694748312149622, "acc_norm": 0.7829787234042553, "acc_norm_stderr": 0.02694748312149622 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5701754385964912, "acc_stderr": 0.04657047260594964, "acc_norm": 0.5701754385964912, "acc_norm_stderr": 0.04657047260594964 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7586206896551724, "acc_stderr": 0.03565998174135302, "acc_norm": 0.7586206896551724, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6851851851851852, "acc_stderr": 0.023919984164047732, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 0.023919984164047732 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5952380952380952, "acc_stderr": 0.04390259265377563, "acc_norm": 0.5952380952380952, "acc_norm_stderr": 0.04390259265377563 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9, "acc_stderr": 0.017066403719657255, "acc_norm": 0.9, "acc_norm_stderr": 0.017066403719657255 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6305418719211823, "acc_stderr": 0.033959703819985726, "acc_norm": 0.6305418719211823, "acc_norm_stderr": 0.033959703819985726 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8606060606060606, "acc_stderr": 0.027045948825865394, "acc_norm": 0.8606060606060606, "acc_norm_stderr": 0.027045948825865394 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9191919191919192, "acc_stderr": 0.019417681889724536, "acc_norm": 0.9191919191919192, "acc_norm_stderr": 0.019417681889724536 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9740932642487047, "acc_stderr": 0.01146452335695318, "acc_norm": 0.9740932642487047, "acc_norm_stderr": 0.01146452335695318 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7974358974358975, "acc_stderr": 0.020377660970371397, "acc_norm": 0.7974358974358975, "acc_norm_stderr": 0.020377660970371397 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4111111111111111, "acc_stderr": 0.02999992350870669, "acc_norm": 0.4111111111111111, "acc_norm_stderr": 0.02999992350870669 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8487394957983193, "acc_stderr": 0.02327425589870796, "acc_norm": 0.8487394957983193, "acc_norm_stderr": 0.02327425589870796 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4900662251655629, "acc_stderr": 0.04081677107248436, "acc_norm": 
0.4900662251655629, "acc_norm_stderr": 0.04081677107248436 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9247706422018349, "acc_stderr": 0.011308662537571743, "acc_norm": 0.9247706422018349, "acc_norm_stderr": 0.011308662537571743 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6435185185185185, "acc_stderr": 0.032664783315272714, "acc_norm": 0.6435185185185185, "acc_norm_stderr": 0.032664783315272714 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9313725490196079, "acc_stderr": 0.017744453647073315, "acc_norm": 0.9313725490196079, "acc_norm_stderr": 0.017744453647073315 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.919831223628692, "acc_stderr": 0.017676679991891632, "acc_norm": 0.919831223628692, "acc_norm_stderr": 0.017676679991891632 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8116591928251121, "acc_stderr": 0.026241132996407256, "acc_norm": 0.8116591928251121, "acc_norm_stderr": 0.026241132996407256 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.9007633587786259, "acc_stderr": 0.026222235171477364, "acc_norm": 0.9007633587786259, "acc_norm_stderr": 0.026222235171477364 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9008264462809917, "acc_stderr": 0.027285246312758957, "acc_norm": 0.9008264462809917, "acc_norm_stderr": 0.027285246312758957 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8981481481481481, "acc_stderr": 0.02923927267563274, "acc_norm": 0.8981481481481481, "acc_norm_stderr": 0.02923927267563274 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8650306748466258, "acc_stderr": 0.026845765054553855, "acc_norm": 0.8650306748466258, "acc_norm_stderr": 0.026845765054553855 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5625, "acc_stderr": 0.04708567521880525, "acc_norm": 0.5625, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.912621359223301, "acc_stderr": 0.027960689125970654, "acc_norm": 0.912621359223301, "acc_norm_stderr": 0.027960689125970654 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9188034188034188, "acc_stderr": 0.01789378490401854, "acc_norm": 0.9188034188034188, "acc_norm_stderr": 0.01789378490401854 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.87, "acc_stderr": 0.03379976689896309, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896309 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9003831417624522, "acc_stderr": 0.010709685591251671, "acc_norm": 0.9003831417624522, "acc_norm_stderr": 0.010709685591251671 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8092485549132948, "acc_stderr": 0.021152676966575277, "acc_norm": 0.8092485549132948, "acc_norm_stderr": 0.021152676966575277 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.7251396648044692, "acc_stderr": 0.014931316703220508, "acc_norm": 0.7251396648044692, "acc_norm_stderr": 0.014931316703220508 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8366013071895425, "acc_stderr": 0.021170623011213512, "acc_norm": 0.8366013071895425, "acc_norm_stderr": 0.021170623011213512 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.819935691318328, "acc_stderr": 0.02182342285774494, "acc_norm": 0.819935691318328, "acc_norm_stderr": 0.02182342285774494 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8888888888888888, "acc_stderr": 0.017486432785880704, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.017486432785880704 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6276595744680851, 
"acc_stderr": 0.02883892147125145, "acc_norm": 0.6276595744680851, "acc_norm_stderr": 0.02883892147125145 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6140808344198174, "acc_stderr": 0.012433398911476138, "acc_norm": 0.6140808344198174, "acc_norm_stderr": 0.012433398911476138 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8786764705882353, "acc_stderr": 0.01983363748105792, "acc_norm": 0.8786764705882353, "acc_norm_stderr": 0.01983363748105792 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.815359477124183, "acc_stderr": 0.015697029240757776, "acc_norm": 0.815359477124183, "acc_norm_stderr": 0.015697029240757776 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.043091187099464585, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8408163265306122, "acc_stderr": 0.023420972069166338, "acc_norm": 0.8408163265306122, "acc_norm_stderr": 0.023420972069166338 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700643, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700643 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.92, "acc_stderr": 0.0272659924344291, "acc_norm": 0.92, "acc_norm_stderr": 0.0272659924344291 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685516, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685516 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.9005847953216374, "acc_stderr": 0.022949025579355027, "acc_norm": 0.9005847953216374, "acc_norm_stderr": 0.022949025579355027 }, "harness|truthfulqa:mc|0": { "mc1": 0.4149326805385557, "mc1_stderr": 0.017248314465805978, "mc2": 0.5803674233221108, "mc2_stderr": 0.014839457098843786 }, "harness|winogrande|5": { "acc": 0.824782951854775, "acc_stderr": 0.010684179227706179 }, "harness|gsm8k|5": { "acc": 0.6944655041698257, "acc_stderr": 0.012688134076726882 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_AA051611__whattest
[ "region:us" ]
2024-01-11T14:55:42+00:00
{"pretty_name": "Evaluation run of AA051611/whattest", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051611/whattest](https://huggingface.co/AA051611/whattest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__whattest\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-11T14:53:31.657383](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__whattest/blob/main/results_2024-01-11T14-53-31.657383.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7616138245668181,\n \"acc_stderr\": 0.028047923748497502,\n \"acc_norm\": 0.7655989871774836,\n \"acc_norm_stderr\": 0.02857778553593116,\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5803674233221108,\n \"mc2_stderr\": 0.014839457098843786\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.014070265519268804,\n \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880534\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6463851822346146,\n \"acc_stderr\": 0.0047711430744261304,\n \"acc_norm\": 0.8442541326428998,\n \"acc_norm_stderr\": 0.003618731658837713\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.024974533450920697,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.024974533450920697\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8415094339622642,\n \"acc_stderr\": 0.02247652871016771,\n \"acc_norm\": 0.8415094339622642,\n \"acc_norm_stderr\": 0.02247652871016771\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.024774516250440175,\n \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.024774516250440175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n 
\"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.049135952012745024,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.049135952012745024\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.02694748312149622,\n \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.02694748312149622\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7586206896551724,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.7586206896551724,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5952380952380952,\n \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.5952380952380952,\n \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.017066403719657255,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.017066403719657255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.033959703819985726,\n \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.033959703819985726\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7974358974358975,\n \"acc_stderr\": 
0.020377660970371397,\n \"acc_norm\": 0.7974358974358975,\n \"acc_norm_stderr\": 0.020377660970371397\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4111111111111111,\n \"acc_stderr\": 0.02999992350870669,\n \"acc_norm\": 0.4111111111111111,\n \"acc_norm_stderr\": 0.02999992350870669\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.02327425589870796,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.02327425589870796\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571743,\n \"acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571743\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.919831223628692,\n \"acc_stderr\": 0.017676679991891632,\n \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.017676679991891632\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.026222235171477364,\n \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.026222235171477364\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563274,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563274\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553855,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553855\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.01789378490401854,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.01789378490401854\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9003831417624522,\n \"acc_stderr\": 0.010709685591251671,\n \"acc_norm\": 0.9003831417624522,\n \"acc_norm_stderr\": 
0.010709685591251671\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575277,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575277\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7251396648044692,\n \"acc_stderr\": 0.014931316703220508,\n \"acc_norm\": 0.7251396648044692,\n \"acc_norm_stderr\": 0.014931316703220508\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213512,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213512\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.819935691318328,\n \"acc_stderr\": 0.02182342285774494,\n \"acc_norm\": 0.819935691318328,\n \"acc_norm_stderr\": 0.02182342285774494\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.017486432785880704,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.017486432785880704\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6276595744680851,\n \"acc_stderr\": 0.02883892147125145,\n \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.02883892147125145\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6140808344198174,\n \"acc_stderr\": 0.012433398911476138,\n \"acc_norm\": 0.6140808344198174,\n \"acc_norm_stderr\": 0.012433398911476138\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8786764705882353,\n \"acc_stderr\": 0.01983363748105792,\n \"acc_norm\": 0.8786764705882353,\n \"acc_norm_stderr\": 0.01983363748105792\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.815359477124183,\n \"acc_stderr\": 0.015697029240757776,\n \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.015697029240757776\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.023420972069166338,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.023420972069166338\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355027,\n \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355027\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5803674233221108,\n \"mc2_stderr\": 0.014839457098843786\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.010684179227706179\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6944655041698257,\n \"acc_stderr\": 0.012688134076726882\n }\n}\n```", "repo_url": "https://huggingface.co/AA051611/whattest", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|arc:challenge|25_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|gsm8k|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hellaswag|10_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T14-53-31.657383.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T14-53-31.657383.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-11T14-53-31.657383.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-11T14-53-31.657383.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T14-53-31.657383.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T14-53-31.657383.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["**/details_harness|winogrande|5_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-11T14-53-31.657383.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_11T14_53_31.657383", "path": ["results_2024-01-11T14-53-31.657383.parquet"]}, {"split": "latest", "path": 
["results_2024-01-11T14-53-31.657383.parquet"]}]}]}
2024-01-11T14:56:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AA051611/whattest Dataset automatically created during the evaluation run of model AA051611/whattest on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-11T14:53:31.657383 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
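For convenience, the loading snippet that the sentence above refers to is preserved verbatim in this record's metadata and is reproduced below; the second call is our own sketch, though the "results" config name and its "latest" split are taken directly from the configs listed in the metadata:

```python
from datasets import load_dataset

# Per-sample details for one task (snippet recorded in this card's metadata)
data = load_dataset("open-llm-leaderboard/details_AA051611__whattest",
    "harness_winogrande_5",
    split="train")

# Aggregated metrics: the "results" config with its "latest" split
# (names taken from the configs in this record's metadata; usage is an assumption)
results = load_dataset("open-llm-leaderboard/details_AA051611__whattest",
    "results",
    split="latest")
```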
[ "# Dataset Card for Evaluation run of AA051611/whattest\n\n\n\nDataset automatically created during the evaluation run of model AA051611/whattest on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T14:53:31.657383(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AA051611/whattest\n\n\n\nDataset automatically created during the evaluation run of model AA051611/whattest on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-11T14:53:31.657383(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a489900f63846ed4b3e58d1c290a90d1ecb7af8a
# Astrophysics French QA

The "Astrophysics French QA" dataset is an innovative collection combining scraped articles from the web with ChatGPT-generated question and answer pairs, offering a unique blend of information and interactive learning in the field of astrophysics. It contains almost 5k prompt/response pairs generated by ChatGPT. It can be used to train, fine-tune, or evaluate LLMs on astrophysics topics.
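As a quick illustration, here is a minimal loading sketch using the Hugging Face `datasets` library; the repository id, the `train` split, and the `prompt`/`response` column names are taken from this record's metadata, while the surrounding code is an assumed typical usage rather than anything the card prescribes:

```python
from datasets import load_dataset

# Repo id, split, and column names come from the card's metadata below.
ds = load_dataset("guigux/astro_qa_fr_0.1", split="train")

example = ds[0]
print(example["prompt"])    # a French astrophysics question
print(example["response"])  # the ChatGPT-generated answer
```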
guigux/astro_qa_fr_0.1
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:fr", "astrophysics", "region:us" ]
2024-01-11T15:13:54+00:00
{"language": ["fr"], "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "pretty_name": "Astrophysics french QA", "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1303701, "num_examples": 4906}], "download_size": 692034, "dataset_size": 1303701}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["astrophysics"]}
2024-01-11T15:34:42+00:00
[]
[ "fr" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-French #astrophysics #region-us
# Astrophysics French QA The "Astrophysics French QA" dataset is an innovative collection combining scraped articles from the web with ChatGPT-generated question and answer pairs, offering a unique blend of information and interactive learning in the field of astrophysics. It contains almost 5k prompt/response pairs generated by ChatGPT. It can be used to train, fine-tune, or evaluate LLMs on astrophysics topics.
[ "# Astrophysics french QA\n\nThe \"Astrophysics french QA\" dataset is an innovative collection combining scraped articles from the web with ChatGPT-generated question and answer pairs, offering a unique blend of information and interactive learning in the field of astrophysics. It contains almost 5k prompt / response generated by ChatGPT. It can be used to train / finetune / evaluate LLMs on astro subjects." ]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-French #astrophysics #region-us \n", "# Astrophysics french QA\n\nThe \"Astrophysics french QA\" dataset is an innovative collection combining scraped articles from the web with ChatGPT-generated question and answer pairs, offering a unique blend of information and interactive learning in the field of astrophysics. It contains almost 5k prompt / response generated by ChatGPT. It can be used to train / finetune / evaluate LLMs on astro subjects." ]
a6022e0734f9da8436031b910525f467ffcf1e3b
# Multilingual Embeddings for Wikipedia in 300+ Languages

This dataset contains the [wikimedia/wikipedia](https://huggingface.co/datasets/wikimedia/wikipedia) dataset dump from 2023-11-01 from Wikipedia in all 300+ languages.

The individual articles have been chunked and embedded with the state-of-the-art multilingual [Cohere Embed V3 embedding model](https://txt.cohere.com/introducing-embed-v3/). This enables an easy way to semantically search across all of Wikipedia or to use it as a knowledge source for your RAG application. In total, it is close to 250M paragraphs / embeddings.

You can also use the model to perform cross-lingual search: Enter your search query in any language and get the most relevant results back.

## Loading the dataset

### Loading the document embeddings

The `train` split contains all document embeddings of the corpus.

You can either load the dataset like this:
```python
from datasets import load_dataset

lang = "simple"  # Use the Simple English Wikipedia subset
docs = load_dataset("Cohere/wikipedia-2023-11-embed-multilingual-v3", lang, split="train")
```

Or you can also stream it without downloading it before:
```python
from datasets import load_dataset

lang = "simple"  # Use the Simple English Wikipedia subset
docs = load_dataset("Cohere/wikipedia-2023-11-embed-multilingual-v3", lang, split="train", streaming=True)

for doc in docs:
    doc_id = doc['_id']
    title = doc['title']
    text = doc['text']
    emb = doc['emb']
```

Note that, depending on the language, the download can be quite large.

## Search

A full search example (on the first 1,000 paragraphs):
```python
# Run: pip install cohere datasets numpy
from datasets import load_dataset
import numpy as np
import cohere

co = cohere.Client("YOUR_COHERE_API_KEY")  # Add your cohere API key from www.cohere.com
lang = "simple"
top_k = 3

# Load at max 1000 chunks + embeddings
max_docs = 1000
docs_stream = load_dataset("Cohere/wikipedia-2023-11-embed-multilingual-v3", lang, split="train", streaming=True)

docs = []
doc_embeddings = []

for doc in docs_stream:
    docs.append(doc)
    doc_embeddings.append(doc['emb'])
    if len(docs) >= max_docs:
        break

doc_embeddings = np.asarray(doc_embeddings)

query = 'Who was Alan Turing'
response = co.embed(texts=[query], model='embed-multilingual-v3.0', input_type="search_query")
query_embedding = response.embeddings
query_embedding = np.asarray(query_embedding)

# Compute dot score between query embedding and document embeddings
dot_scores = np.matmul(query_embedding, doc_embeddings.transpose())[0]
top_k_hits = np.argpartition(dot_scores, -top_k)[-top_k:]

# Print results
print("Query:", query)
for doc_id in top_k_hits:
    print(docs[doc_id]['title'])
    print(docs[doc_id]['text'])
    print(docs[doc_id]['url'], "\n")
```

## Overview

The following table contains all language codes together with the total numbers of passages.
| Language | #Docs |
|---|:---:|
| en | 41,488,110 |
| de | 20,772,081 |
| fr | 17,813,768 |
| ru | 13,734,543 |
| es | 12,905,284 |
| it | 10,462,162 |
| ceb | 9,818,657 |
| uk | 6,901,192 |
| ja | 6,626,537 |
| nl | 6,101,353 |
| pl | 5,973,650 |
| pt | 5,637,930 |
| sv | 4,911,480 |
| ca | 4,156,889 |
| ar | 3,691,451 |
| cs | 3,118,524 |
| he | 2,948,882 |
| hu | 2,924,609 |
| vi | 2,835,049 |
| zh | 2,775,260 |
| fi | 2,427,097 |
| id | 2,358,335 |
| no | 2,211,270 |
| sr | 2,154,631 |
| fa | 2,073,154 |
| tr | 1,775,036 |
| ro | 1,770,527 |
| el | 1,599,770 |
| ko | 1,513,291 |
| bg | 1,455,765 |
| hy | 1,386,140 |
| eu | 1,327,579 |
| da | 1,224,982 |
| eo | 1,216,706 |
| war | 1,185,097 |
| sh | 1,139,309 |
| tt | 1,119,546 |
| arz | 1,089,164 |
| gl | 1,056,990 |
| et | 1,054,770 |
| ce | 1,013,217 |
| ast | 1,010,445 |
| sl | 984,855 |
| hr | 910,923 |
| sk | 874,014 |
| ms | 869,579 |
| be | 857,033 |
| th | 839,712 |
| az | 817,631 |
| uz | 811,028 |
| mk | 784,576 |
| lt | 770,069 |
| bn | 767,965 |
| cy | 762,338 |
| ta | 685,975 |
| simple | 646,424 |
| te | 634,778 |
| kk | 627,085 |
| ka | 595,401 |
| hi | 541,822 |
| nn | 530,590 |
| lv | 484,957 |
| af | 461,674 |
| ba | 434,939 |
| ur | 434,269 |
| bs | 396,692 |
| sq | 388,788 |
| ml | 384,795 |
| min | 373,156 |
| la | 340,521 |
| pnb | 335,958 |
| be-x-old | 314,600 |
| kn | 309,085 |
| azb | 294,248 |
| oc | 283,297 |
| zh-min-nan | 278,547 |
| fy | 248,075 |
| my | 241,025 |
| lb | 216,558 |
| ky | 216,344 |
| als | 206,387 |
| mr | 203,479 |
| br | 200,486 |
| pa | 188,091 |
| is | 177,272 |
| mg | 171,947 |
| sw | 171,650 |
| ha | 167,807 |
| tl | 166,907 |
| nds | 166,019 |
| an | 143,163 |
| jv | 142,104 |
| ps | 138,240 |
| ig | 132,250 |
| new | 128,696 |
| tg | 128,237 |
| ga | 125,456 |
| lld | 125,094 |
| su | 124,390 |
| cv | 122,671 |
| ckb | 120,886 |
| si | 119,223 |
| mn | 114,878 |
| lmo | 103,836 |
| io | 101,713 |
| gu | 99,450 |
| vec | 95,072 |
| zh-yue | 89,145 |
| bar | 88,238 |
| sco | 83,906 |
| ne | 83,598 |
| ku | 82,935 |
| hyw | 82,343 |
| pms | 77,834 |
| as | 76,093 |
| km | 74,177 |
| sah | 71,599 |
| li | 69,267 |
| or | 65,510 |
| mt | 64,038 |
| szl | 56,836 |
| yi | 55,375 |
| ht | 55,079 |
| dag | 53,343 |
| sa | 51,735 |
| nv | 49,355 |
| bpy | 47,757 |
| vo | 47,375 |
| ug | 44,764 |
| sat | 43,500 |
| ia | 42,012 |
| bo | 41,438 |
| mwl | 41,273 |
| sd | 40,395 |
| bcl | 39,967 |
| mnw | 39,578 |
| hsb | 39,560 |
| avk | 39,001 |
| scn | 38,359 |
| rm | 37,436 |
| diq | 34,743 |
| vep | 33,654 |
| xmf | 33,238 |
| ban | 32,320 |
| wa | 32,132 |
| ilo | 31,046 |
| nds-nl | 30,918 |
| qu | 30,529 |
| so | 29,936 |
| mhr | 29,619 |
| vls | 29,227 |
| sc | 28,977 |
| fo | 28,809 |
| gd | 28,149 |
| rw | 28,037 |
| gom | 27,792 |
| yo | 27,789 |
| tum | 26,743 |
| wuu | 26,532 |
| frr | 26,010 |
| sn | 25,941 |
| tk | 24,269 |
| blk | 24,194 |
| mzn | 23,837 |
| co | 23,065 |
| szy | 22,854 |
| am | 22,467 |
| shn | 22,432 |
| skr | 21,081 |
| lfn | 20,781 |
| tyv | 20,762 |
| lij | 20,553 |
| ie | 19,994 |
| rue | 19,916 |
| crh | 19,016 |
| gor | 18,146 |
| ary | 17,463 |
| dv | 16,941 |
| lg | 16,751 |
| roa-tara | 16,572 |
| bjn | 16,429 |
| tw | 16,304 |
| bh | 15,938 |
| pam | 15,134 |
| os | 15,096 |
| myv | 15,062 |
| gn | 14,983 |
| lez | 14,152 |
| mai | 13,806 |
| kv | 13,534 |
| pcd | 13,057 |
| zh-classical | 12,791 |
| zea | 12,528 |
| lo | 12,525 |
| gv | 12,074 |
| stq | 11,890 |
| zu | 11,680 |
| smn | 11,672 |
| kw | 11,539 |
| bat-smg | 11,240 |
| hif | 11,215 |
| ext | 10,967 |
| ace | 10,821 |
| trv | 10,546 |
| ami | 10,538 |
| tcy | 10,531 |
| lad | 10,386 |
| alt | 10,256 |
| pap | 10,187 |
| kab | 10,179 |
| fur | 10,148 |
| nap | 10,079 |
| mrj | 9,771 |
| kaa | 9,548 |
| nqo | 9,153 |
| glk | 9,120 |
| pfl | 8,790 |
| fiu-vro | 8,757 |
| nso | 8,635 |
| jbo | 8,577 |
| bxr | 8,549 |
| wo | 8,549 |
| olo | 8,530 |
| map-bms | 8,393 |
| ksh | 8,226 |
| csb | 8,085 |
| av | 7,873 |
| mni | 7,740 |
| udm | 7,730 |
| mi | 7,643 |
| kbp | 7,616 |
| dsb | 7,536 |
| frp | 7,294 |
| om | 7,045 |
| ang | 7,023 |
| hak | 6,866 |
| gur | 6,761 |
| se | 6,733 |
| anp | 6,704 |
| tay | 6,434 |
| mdf | 6,351 |
| gcr | 6,347 |
| koi | 6,300 |
| krc | 6,293 |
| ay | 5,985 |
| cdo | 5,917 |
| nrm | 5,786 |
| xh | 5,756 |
| tn | 5,712 |
| tly | 5,598 |
| shi | 5,179 |
| pcm | 5,076 |
| fat | 4,968 |
| nia | 4,795 |
| dty | 4,728 |
| kbd | 4,667 |
| gpe | 4,289 |
| cbk-zam | 4,224 |
| ff | 4,166 |
| dz | 4,117 |
| guw | 3,982 |
| eml | 3,979 |
| ln | 3,774 |
| inh | 3,768 |
| nah | 3,720 |
| ab | 3,465 |
| ks | 3,255 |
| mad | 3,236 |
| haw | 3,227 |
| gag | 3,076 |
| tet | 3,030 |
| ny | 2,933 |
| pag | 2,727 |
| guc | 2,454 |
| roa-rup | 2,409 |
| jam | 2,387 |
| awa | 2,242 |
| pdc | 2,239 |
| to | 2,165 |
| za | 2,132 |
| st | 2,051 |
| ltg | 2,005 |
| atj | 1,967 |
| nov | 1,916 |
| ss | 1,904 |
| pwn | 1,881 |
| ee | 1,819 |
| sm | 1,659 |
| ts | 1,645 |
| gan | 1,626 |
| xal | 1,619 |
| kcg | 1,555 |
| cu | 1,477 |
| srn | 1,395 |
| got | 1,280 |
| fon | 1,247 |
| din | 1,214 |
| arc | 1,167 |
| fj | 1,164 |
| rmy | 1,113 |
| ady | 1,040 |
| rn | 1,033 |
| bm | 1,017 |
| tpi | 957 |
| ve | 919 |
| ki | 798 |
| pnt | 796 |
| chr | 788 |
| kl | 770 |
| lbe | 766 |
| bi | 718 |
| ti | 706 |
| kg | 609 |
| pih | 606 |
| ch | 513 |
| bug | 429 |
| ty | 297 |
| ik | 275 |
| iu | 263 |
| pi | 260 |
| sg | 204 |
| chy | 57 |
| cr | 41 |
| Total | 247,154,006 |
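As a small addendum to the search example above: because queries and documents share one multilingual embedding space, cross-lingual retrieval only requires embedding the query in another language. The sketch below reuses `co`, `docs`, and `doc_embeddings` from the search example; the German query string is purely illustrative:

```python
import numpy as np

# Cross-lingual query: a German question scored against the Simple English chunks loaded above
query = "Wer war Alan Turing?"
response = co.embed(texts=[query], model='embed-multilingual-v3.0', input_type="search_query")
query_embedding = np.asarray(response.embeddings)

# Same dot-product scoring as in the search example
dot_scores = np.matmul(query_embedding, doc_embeddings.transpose())[0]
best_hit = int(np.argmax(dot_scores))
print(docs[best_hit]['title'], "-", docs[best_hit]['url'])
```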
Cohere/wikipedia-2023-11-embed-multilingual-v3
[ "region:us" ]
2024-01-11T16:28:41+00:00
{"configs": [{"config_name": "ab", "data_files": [{"split": "train", "path": "ab/*"}]}, {"config_name": "ace", "data_files": [{"split": "train", "path": "ace/*"}]}, {"config_name": "ady", "data_files": [{"split": "train", "path": "ady/*"}]}, {"config_name": "af", "data_files": [{"split": "train", "path": "af/*"}]}, {"config_name": "als", "data_files": [{"split": "train", "path": "als/*"}]}, {"config_name": "alt", "data_files": [{"split": "train", "path": "alt/*"}]}, {"config_name": "am", "data_files": [{"split": "train", "path": "am/*"}]}, {"config_name": "ami", "data_files": [{"split": "train", "path": "ami/*"}]}, {"config_name": "an", "data_files": [{"split": "train", "path": "an/*"}]}, {"config_name": "ang", "data_files": [{"split": "train", "path": "ang/*"}]}, {"config_name": "anp", "data_files": [{"split": "train", "path": "anp/*"}]}, {"config_name": "ar", "data_files": [{"split": "train", "path": "ar/*"}]}, {"config_name": "arc", "data_files": [{"split": "train", "path": "arc/*"}]}, {"config_name": "ary", "data_files": [{"split": "train", "path": "ary/*"}]}, {"config_name": "arz", "data_files": [{"split": "train", "path": "arz/*"}]}, {"config_name": "as", "data_files": [{"split": "train", "path": "as/*"}]}, {"config_name": "ast", "data_files": [{"split": "train", "path": "ast/*"}]}, {"config_name": "atj", "data_files": [{"split": "train", "path": "atj/*"}]}, {"config_name": "av", "data_files": [{"split": "train", "path": "av/*"}]}, {"config_name": "avk", "data_files": [{"split": "train", "path": "avk/*"}]}, {"config_name": "awa", "data_files": [{"split": "train", "path": "awa/*"}]}, {"config_name": "ay", "data_files": [{"split": "train", "path": "ay/*"}]}, {"config_name": "az", "data_files": [{"split": "train", "path": "az/*"}]}, {"config_name": "azb", "data_files": [{"split": "train", "path": "azb/*"}]}, {"config_name": "ba", "data_files": [{"split": "train", "path": "ba/*"}]}, {"config_name": "ban", "data_files": [{"split": "train", "path": "ban/*"}]}, {"config_name": "bar", "data_files": [{"split": "train", "path": "bar/*"}]}, {"config_name": "bat-smg", "data_files": [{"split": "train", "path": "bat-smg/*"}]}, {"config_name": "bcl", "data_files": [{"split": "train", "path": "bcl/*"}]}, {"config_name": "be", "data_files": [{"split": "train", "path": "be/*"}]}, {"config_name": "be-x-old", "data_files": [{"split": "train", "path": "be-x-old/*"}]}, {"config_name": "bg", "data_files": [{"split": "train", "path": "bg/*"}]}, {"config_name": "bh", "data_files": [{"split": "train", "path": "bh/*"}]}, {"config_name": "bi", "data_files": [{"split": "train", "path": "bi/*"}]}, {"config_name": "bjn", "data_files": [{"split": "train", "path": "bjn/*"}]}, {"config_name": "blk", "data_files": [{"split": "train", "path": "blk/*"}]}, {"config_name": "bm", "data_files": [{"split": "train", "path": "bm/*"}]}, {"config_name": "bn", "data_files": [{"split": "train", "path": "bn/*"}]}, {"config_name": "bo", "data_files": [{"split": "train", "path": "bo/*"}]}, {"config_name": "bpy", "data_files": [{"split": "train", "path": "bpy/*"}]}, {"config_name": "br", "data_files": [{"split": "train", "path": "br/*"}]}, {"config_name": "bs", "data_files": [{"split": "train", "path": "bs/*"}]}, {"config_name": "bug", "data_files": [{"split": "train", "path": "bug/*"}]}, {"config_name": "bxr", "data_files": [{"split": "train", "path": "bxr/*"}]}, {"config_name": "ca", "data_files": [{"split": "train", "path": "ca/*"}]}, {"config_name": "cbk-zam", "data_files": [{"split": "train", "path": "cbk-zam/*"}]}, 
{"config_name": "cdo", "data_files": [{"split": "train", "path": "cdo/*"}]}, {"config_name": "ce", "data_files": [{"split": "train", "path": "ce/*"}]}, {"config_name": "ceb", "data_files": [{"split": "train", "path": "ceb/*"}]}, {"config_name": "ch", "data_files": [{"split": "train", "path": "ch/*"}]}, {"config_name": "chr", "data_files": [{"split": "train", "path": "chr/*"}]}, {"config_name": "chy", "data_files": [{"split": "train", "path": "chy/*"}]}, {"config_name": "ckb", "data_files": [{"split": "train", "path": "ckb/*"}]}, {"config_name": "co", "data_files": [{"split": "train", "path": "co/*"}]}, {"config_name": "cr", "data_files": [{"split": "train", "path": "cr/*"}]}, {"config_name": "crh", "data_files": [{"split": "train", "path": "crh/*"}]}, {"config_name": "cs", "data_files": [{"split": "train", "path": "cs/*"}]}, {"config_name": "csb", "data_files": [{"split": "train", "path": "csb/*"}]}, {"config_name": "cu", "data_files": [{"split": "train", "path": "cu/*"}]}, {"config_name": "cv", "data_files": [{"split": "train", "path": "cv/*"}]}, {"config_name": "cy", "data_files": [{"split": "train", "path": "cy/*"}]}, {"config_name": "da", "data_files": [{"split": "train", "path": "da/*"}]}, {"config_name": "dag", "data_files": [{"split": "train", "path": "dag/*"}]}, {"config_name": "de", "data_files": [{"split": "train", "path": "de/*"}]}, {"config_name": "din", "data_files": [{"split": "train", "path": "din/*"}]}, {"config_name": "diq", "data_files": [{"split": "train", "path": "diq/*"}]}, {"config_name": "dsb", "data_files": [{"split": "train", "path": "dsb/*"}]}, {"config_name": "dty", "data_files": [{"split": "train", "path": "dty/*"}]}, {"config_name": "dv", "data_files": [{"split": "train", "path": "dv/*"}]}, {"config_name": "dz", "data_files": [{"split": "train", "path": "dz/*"}]}, {"config_name": "ee", "data_files": [{"split": "train", "path": "ee/*"}]}, {"config_name": "el", "data_files": [{"split": "train", "path": "el/*"}]}, {"config_name": "eml", "data_files": [{"split": "train", "path": "eml/*"}]}, {"config_name": "en", "data_files": [{"split": "train", "path": "en/*"}]}, {"config_name": "eo", "data_files": [{"split": "train", "path": "eo/*"}]}, {"config_name": "es", "data_files": [{"split": "train", "path": "es/*"}]}, {"config_name": "et", "data_files": [{"split": "train", "path": "et/*"}]}, {"config_name": "eu", "data_files": [{"split": "train", "path": "eu/*"}]}, {"config_name": "ext", "data_files": [{"split": "train", "path": "ext/*"}]}, {"config_name": "fa", "data_files": [{"split": "train", "path": "fa/*"}]}, {"config_name": "fat", "data_files": [{"split": "train", "path": "fat/*"}]}, {"config_name": "ff", "data_files": [{"split": "train", "path": "ff/*"}]}, {"config_name": "fi", "data_files": [{"split": "train", "path": "fi/*"}]}, {"config_name": "fiu-vro", "data_files": [{"split": "train", "path": "fiu-vro/*"}]}, {"config_name": "fj", "data_files": [{"split": "train", "path": "fj/*"}]}, {"config_name": "fo", "data_files": [{"split": "train", "path": "fo/*"}]}, {"config_name": "fon", "data_files": [{"split": "train", "path": "fon/*"}]}, {"config_name": "fr", "data_files": [{"split": "train", "path": "fr/*"}]}, {"config_name": "frp", "data_files": [{"split": "train", "path": "frp/*"}]}, {"config_name": "frr", "data_files": [{"split": "train", "path": "frr/*"}]}, {"config_name": "fur", "data_files": [{"split": "train", "path": "fur/*"}]}, {"config_name": "fy", "data_files": [{"split": "train", "path": "fy/*"}]}, {"config_name": "ga", "data_files": [{"split": "train", 
"path": "ga/*"}]}, {"config_name": "gag", "data_files": [{"split": "train", "path": "gag/*"}]}, {"config_name": "gan", "data_files": [{"split": "train", "path": "gan/*"}]}, {"config_name": "gcr", "data_files": [{"split": "train", "path": "gcr/*"}]}, {"config_name": "gd", "data_files": [{"split": "train", "path": "gd/*"}]}, {"config_name": "gl", "data_files": [{"split": "train", "path": "gl/*"}]}, {"config_name": "glk", "data_files": [{"split": "train", "path": "glk/*"}]}, {"config_name": "gn", "data_files": [{"split": "train", "path": "gn/*"}]}, {"config_name": "gom", "data_files": [{"split": "train", "path": "gom/*"}]}, {"config_name": "gor", "data_files": [{"split": "train", "path": "gor/*"}]}, {"config_name": "got", "data_files": [{"split": "train", "path": "got/*"}]}, {"config_name": "gpe", "data_files": [{"split": "train", "path": "gpe/*"}]}, {"config_name": "gu", "data_files": [{"split": "train", "path": "gu/*"}]}, {"config_name": "guc", "data_files": [{"split": "train", "path": "guc/*"}]}, {"config_name": "gur", "data_files": [{"split": "train", "path": "gur/*"}]}, {"config_name": "guw", "data_files": [{"split": "train", "path": "guw/*"}]}, {"config_name": "gv", "data_files": [{"split": "train", "path": "gv/*"}]}, {"config_name": "ha", "data_files": [{"split": "train", "path": "ha/*"}]}, {"config_name": "hak", "data_files": [{"split": "train", "path": "hak/*"}]}, {"config_name": "haw", "data_files": [{"split": "train", "path": "haw/*"}]}, {"config_name": "he", "data_files": [{"split": "train", "path": "he/*"}]}, {"config_name": "hi", "data_files": [{"split": "train", "path": "hi/*"}]}, {"config_name": "hif", "data_files": [{"split": "train", "path": "hif/*"}]}, {"config_name": "hr", "data_files": [{"split": "train", "path": "hr/*"}]}, {"config_name": "hsb", "data_files": [{"split": "train", "path": "hsb/*"}]}, {"config_name": "ht", "data_files": [{"split": "train", "path": "ht/*"}]}, {"config_name": "hu", "data_files": [{"split": "train", "path": "hu/*"}]}, {"config_name": "hy", "data_files": [{"split": "train", "path": "hy/*"}]}, {"config_name": "hyw", "data_files": [{"split": "train", "path": "hyw/*"}]}, {"config_name": "ia", "data_files": [{"split": "train", "path": "ia/*"}]}, {"config_name": "id", "data_files": [{"split": "train", "path": "id/*"}]}, {"config_name": "ie", "data_files": [{"split": "train", "path": "ie/*"}]}, {"config_name": "ig", "data_files": [{"split": "train", "path": "ig/*"}]}, {"config_name": "ik", "data_files": [{"split": "train", "path": "ik/*"}]}, {"config_name": "ilo", "data_files": [{"split": "train", "path": "ilo/*"}]}, {"config_name": "inh", "data_files": [{"split": "train", "path": "inh/*"}]}, {"config_name": "io", "data_files": [{"split": "train", "path": "io/*"}]}, {"config_name": "is", "data_files": [{"split": "train", "path": "is/*"}]}, {"config_name": "it", "data_files": [{"split": "train", "path": "it/*"}]}, {"config_name": "iu", "data_files": [{"split": "train", "path": "iu/*"}]}, {"config_name": "ja", "data_files": [{"split": "train", "path": "ja/*"}]}, {"config_name": "jam", "data_files": [{"split": "train", "path": "jam/*"}]}, {"config_name": "jbo", "data_files": [{"split": "train", "path": "jbo/*"}]}, {"config_name": "jv", "data_files": [{"split": "train", "path": "jv/*"}]}, {"config_name": "ka", "data_files": [{"split": "train", "path": "ka/*"}]}, {"config_name": "kaa", "data_files": [{"split": "train", "path": "kaa/*"}]}, {"config_name": "kab", "data_files": [{"split": "train", "path": "kab/*"}]}, {"config_name": "kbd", "data_files": 
[{"split": "train", "path": "kbd/*"}]}, {"config_name": "kbp", "data_files": [{"split": "train", "path": "kbp/*"}]}, {"config_name": "kcg", "data_files": [{"split": "train", "path": "kcg/*"}]}, {"config_name": "kg", "data_files": [{"split": "train", "path": "kg/*"}]}, {"config_name": "ki", "data_files": [{"split": "train", "path": "ki/*"}]}, {"config_name": "kk", "data_files": [{"split": "train", "path": "kk/*"}]}, {"config_name": "kl", "data_files": [{"split": "train", "path": "kl/*"}]}, {"config_name": "km", "data_files": [{"split": "train", "path": "km/*"}]}, {"config_name": "kn", "data_files": [{"split": "train", "path": "kn/*"}]}, {"config_name": "ko", "data_files": [{"split": "train", "path": "ko/*"}]}, {"config_name": "koi", "data_files": [{"split": "train", "path": "koi/*"}]}, {"config_name": "krc", "data_files": [{"split": "train", "path": "krc/*"}]}, {"config_name": "ks", "data_files": [{"split": "train", "path": "ks/*"}]}, {"config_name": "ksh", "data_files": [{"split": "train", "path": "ksh/*"}]}, {"config_name": "ku", "data_files": [{"split": "train", "path": "ku/*"}]}, {"config_name": "kv", "data_files": [{"split": "train", "path": "kv/*"}]}, {"config_name": "kw", "data_files": [{"split": "train", "path": "kw/*"}]}, {"config_name": "ky", "data_files": [{"split": "train", "path": "ky/*"}]}, {"config_name": "la", "data_files": [{"split": "train", "path": "la/*"}]}, {"config_name": "lad", "data_files": [{"split": "train", "path": "lad/*"}]}, {"config_name": "lb", "data_files": [{"split": "train", "path": "lb/*"}]}, {"config_name": "lbe", "data_files": [{"split": "train", "path": "lbe/*"}]}, {"config_name": "lez", "data_files": [{"split": "train", "path": "lez/*"}]}, {"config_name": "lfn", "data_files": [{"split": "train", "path": "lfn/*"}]}, {"config_name": "lg", "data_files": [{"split": "train", "path": "lg/*"}]}, {"config_name": "li", "data_files": [{"split": "train", "path": "li/*"}]}, {"config_name": "lij", "data_files": [{"split": "train", "path": "lij/*"}]}, {"config_name": "lld", "data_files": [{"split": "train", "path": "lld/*"}]}, {"config_name": "lmo", "data_files": [{"split": "train", "path": "lmo/*"}]}, {"config_name": "ln", "data_files": [{"split": "train", "path": "ln/*"}]}, {"config_name": "lo", "data_files": [{"split": "train", "path": "lo/*"}]}, {"config_name": "lt", "data_files": [{"split": "train", "path": "lt/*"}]}, {"config_name": "ltg", "data_files": [{"split": "train", "path": "ltg/*"}]}, {"config_name": "lv", "data_files": [{"split": "train", "path": "lv/*"}]}, {"config_name": "mad", "data_files": [{"split": "train", "path": "mad/*"}]}, {"config_name": "mai", "data_files": [{"split": "train", "path": "mai/*"}]}, {"config_name": "map-bms", "data_files": [{"split": "train", "path": "map-bms/*"}]}, {"config_name": "mdf", "data_files": [{"split": "train", "path": "mdf/*"}]}, {"config_name": "mg", "data_files": [{"split": "train", "path": "mg/*"}]}, {"config_name": "mhr", "data_files": [{"split": "train", "path": "mhr/*"}]}, {"config_name": "mi", "data_files": [{"split": "train", "path": "mi/*"}]}, {"config_name": "min", "data_files": [{"split": "train", "path": "min/*"}]}, {"config_name": "mk", "data_files": [{"split": "train", "path": "mk/*"}]}, {"config_name": "ml", "data_files": [{"split": "train", "path": "ml/*"}]}, {"config_name": "mn", "data_files": [{"split": "train", "path": "mn/*"}]}, {"config_name": "mni", "data_files": [{"split": "train", "path": "mni/*"}]}, {"config_name": "mnw", "data_files": [{"split": "train", "path": "mnw/*"}]}, 
{"config_name": "mr", "data_files": [{"split": "train", "path": "mr/*"}]}, {"config_name": "mrj", "data_files": [{"split": "train", "path": "mrj/*"}]}, {"config_name": "ms", "data_files": [{"split": "train", "path": "ms/*"}]}, {"config_name": "mt", "data_files": [{"split": "train", "path": "mt/*"}]}, {"config_name": "mwl", "data_files": [{"split": "train", "path": "mwl/*"}]}, {"config_name": "my", "data_files": [{"split": "train", "path": "my/*"}]}, {"config_name": "myv", "data_files": [{"split": "train", "path": "myv/*"}]}, {"config_name": "mzn", "data_files": [{"split": "train", "path": "mzn/*"}]}, {"config_name": "nah", "data_files": [{"split": "train", "path": "nah/*"}]}, {"config_name": "nap", "data_files": [{"split": "train", "path": "nap/*"}]}, {"config_name": "nds", "data_files": [{"split": "train", "path": "nds/*"}]}, {"config_name": "nds-nl", "data_files": [{"split": "train", "path": "nds-nl/*"}]}, {"config_name": "ne", "data_files": [{"split": "train", "path": "ne/*"}]}, {"config_name": "new", "data_files": [{"split": "train", "path": "new/*"}]}, {"config_name": "nia", "data_files": [{"split": "train", "path": "nia/*"}]}, {"config_name": "nl", "data_files": [{"split": "train", "path": "nl/*"}]}, {"config_name": "nn", "data_files": [{"split": "train", "path": "nn/*"}]}, {"config_name": "no", "data_files": [{"split": "train", "path": "no/*"}]}, {"config_name": "nov", "data_files": [{"split": "train", "path": "nov/*"}]}, {"config_name": "nqo", "data_files": [{"split": "train", "path": "nqo/*"}]}, {"config_name": "nrm", "data_files": [{"split": "train", "path": "nrm/*"}]}, {"config_name": "nso", "data_files": [{"split": "train", "path": "nso/*"}]}, {"config_name": "nv", "data_files": [{"split": "train", "path": "nv/*"}]}, {"config_name": "ny", "data_files": [{"split": "train", "path": "ny/*"}]}, {"config_name": "oc", "data_files": [{"split": "train", "path": "oc/*"}]}, {"config_name": "olo", "data_files": [{"split": "train", "path": "olo/*"}]}, {"config_name": "om", "data_files": [{"split": "train", "path": "om/*"}]}, {"config_name": "or", "data_files": [{"split": "train", "path": "or/*"}]}, {"config_name": "os", "data_files": [{"split": "train", "path": "os/*"}]}, {"config_name": "pa", "data_files": [{"split": "train", "path": "pa/*"}]}, {"config_name": "pag", "data_files": [{"split": "train", "path": "pag/*"}]}, {"config_name": "pam", "data_files": [{"split": "train", "path": "pam/*"}]}, {"config_name": "pap", "data_files": [{"split": "train", "path": "pap/*"}]}, {"config_name": "pcd", "data_files": [{"split": "train", "path": "pcd/*"}]}, {"config_name": "pcm", "data_files": [{"split": "train", "path": "pcm/*"}]}, {"config_name": "pdc", "data_files": [{"split": "train", "path": "pdc/*"}]}, {"config_name": "pfl", "data_files": [{"split": "train", "path": "pfl/*"}]}, {"config_name": "pi", "data_files": [{"split": "train", "path": "pi/*"}]}, {"config_name": "pih", "data_files": [{"split": "train", "path": "pih/*"}]}, {"config_name": "pl", "data_files": [{"split": "train", "path": "pl/*"}]}, {"config_name": "pms", "data_files": [{"split": "train", "path": "pms/*"}]}, {"config_name": "pnb", "data_files": [{"split": "train", "path": "pnb/*"}]}, {"config_name": "pnt", "data_files": [{"split": "train", "path": "pnt/*"}]}, {"config_name": "ps", "data_files": [{"split": "train", "path": "ps/*"}]}, {"config_name": "pt", "data_files": [{"split": "train", "path": "pt/*"}]}, {"config_name": "pwn", "data_files": [{"split": "train", "path": "pwn/*"}]}, {"config_name": "qu", "data_files": 
[{"split": "train", "path": "qu/*"}]}, {"config_name": "rm", "data_files": [{"split": "train", "path": "rm/*"}]}, {"config_name": "rmy", "data_files": [{"split": "train", "path": "rmy/*"}]}, {"config_name": "rn", "data_files": [{"split": "train", "path": "rn/*"}]}, {"config_name": "ro", "data_files": [{"split": "train", "path": "ro/*"}]}, {"config_name": "roa-rup", "data_files": [{"split": "train", "path": "roa-rup/*"}]}, {"config_name": "roa-tara", "data_files": [{"split": "train", "path": "roa-tara/*"}]}, {"config_name": "ru", "data_files": [{"split": "train", "path": "ru/*"}]}, {"config_name": "rue", "data_files": [{"split": "train", "path": "rue/*"}]}, {"config_name": "rw", "data_files": [{"split": "train", "path": "rw/*"}]}, {"config_name": "sa", "data_files": [{"split": "train", "path": "sa/*"}]}, {"config_name": "sah", "data_files": [{"split": "train", "path": "sah/*"}]}, {"config_name": "sat", "data_files": [{"split": "train", "path": "sat/*"}]}, {"config_name": "sc", "data_files": [{"split": "train", "path": "sc/*"}]}, {"config_name": "scn", "data_files": [{"split": "train", "path": "scn/*"}]}, {"config_name": "sco", "data_files": [{"split": "train", "path": "sco/*"}]}, {"config_name": "sd", "data_files": [{"split": "train", "path": "sd/*"}]}, {"config_name": "se", "data_files": [{"split": "train", "path": "se/*"}]}, {"config_name": "sg", "data_files": [{"split": "train", "path": "sg/*"}]}, {"config_name": "sh", "data_files": [{"split": "train", "path": "sh/*"}]}, {"config_name": "shi", "data_files": [{"split": "train", "path": "shi/*"}]}, {"config_name": "shn", "data_files": [{"split": "train", "path": "shn/*"}]}, {"config_name": "si", "data_files": [{"split": "train", "path": "si/*"}]}, {"config_name": "simple", "data_files": [{"split": "train", "path": "simple/*"}]}, {"config_name": "sk", "data_files": [{"split": "train", "path": "sk/*"}]}, {"config_name": "skr", "data_files": [{"split": "train", "path": "skr/*"}]}, {"config_name": "sl", "data_files": [{"split": "train", "path": "sl/*"}]}, {"config_name": "sm", "data_files": [{"split": "train", "path": "sm/*"}]}, {"config_name": "smn", "data_files": [{"split": "train", "path": "smn/*"}]}, {"config_name": "sn", "data_files": [{"split": "train", "path": "sn/*"}]}, {"config_name": "so", "data_files": [{"split": "train", "path": "so/*"}]}, {"config_name": "sq", "data_files": [{"split": "train", "path": "sq/*"}]}, {"config_name": "sr", "data_files": [{"split": "train", "path": "sr/*"}]}, {"config_name": "srn", "data_files": [{"split": "train", "path": "srn/*"}]}, {"config_name": "ss", "data_files": [{"split": "train", "path": "ss/*"}]}, {"config_name": "st", "data_files": [{"split": "train", "path": "st/*"}]}, {"config_name": "stq", "data_files": [{"split": "train", "path": "stq/*"}]}, {"config_name": "su", "data_files": [{"split": "train", "path": "su/*"}]}, {"config_name": "sv", "data_files": [{"split": "train", "path": "sv/*"}]}, {"config_name": "sw", "data_files": [{"split": "train", "path": "sw/*"}]}, {"config_name": "szl", "data_files": [{"split": "train", "path": "szl/*"}]}, {"config_name": "szy", "data_files": [{"split": "train", "path": "szy/*"}]}, {"config_name": "ta", "data_files": [{"split": "train", "path": "ta/*"}]}, {"config_name": "tay", "data_files": [{"split": "train", "path": "tay/*"}]}, {"config_name": "tcy", "data_files": [{"split": "train", "path": "tcy/*"}]}, {"config_name": "te", "data_files": [{"split": "train", "path": "te/*"}]}, {"config_name": "tet", "data_files": [{"split": "train", "path": "tet/*"}]}, 
{"config_name": "tg", "data_files": [{"split": "train", "path": "tg/*"}]}, {"config_name": "th", "data_files": [{"split": "train", "path": "th/*"}]}, {"config_name": "ti", "data_files": [{"split": "train", "path": "ti/*"}]}, {"config_name": "tk", "data_files": [{"split": "train", "path": "tk/*"}]}, {"config_name": "tl", "data_files": [{"split": "train", "path": "tl/*"}]}, {"config_name": "tly", "data_files": [{"split": "train", "path": "tly/*"}]}, {"config_name": "tn", "data_files": [{"split": "train", "path": "tn/*"}]}, {"config_name": "to", "data_files": [{"split": "train", "path": "to/*"}]}, {"config_name": "tpi", "data_files": [{"split": "train", "path": "tpi/*"}]}, {"config_name": "tr", "data_files": [{"split": "train", "path": "tr/*"}]}, {"config_name": "trv", "data_files": [{"split": "train", "path": "trv/*"}]}, {"config_name": "ts", "data_files": [{"split": "train", "path": "ts/*"}]}, {"config_name": "tt", "data_files": [{"split": "train", "path": "tt/*"}]}, {"config_name": "tum", "data_files": [{"split": "train", "path": "tum/*"}]}, {"config_name": "tw", "data_files": [{"split": "train", "path": "tw/*"}]}, {"config_name": "ty", "data_files": [{"split": "train", "path": "ty/*"}]}, {"config_name": "tyv", "data_files": [{"split": "train", "path": "tyv/*"}]}, {"config_name": "udm", "data_files": [{"split": "train", "path": "udm/*"}]}, {"config_name": "ug", "data_files": [{"split": "train", "path": "ug/*"}]}, {"config_name": "uk", "data_files": [{"split": "train", "path": "uk/*"}]}, {"config_name": "ur", "data_files": [{"split": "train", "path": "ur/*"}]}, {"config_name": "uz", "data_files": [{"split": "train", "path": "uz/*"}]}, {"config_name": "ve", "data_files": [{"split": "train", "path": "ve/*"}]}, {"config_name": "vec", "data_files": [{"split": "train", "path": "vec/*"}]}, {"config_name": "vep", "data_files": [{"split": "train", "path": "vep/*"}]}, {"config_name": "vi", "data_files": [{"split": "train", "path": "vi/*"}]}, {"config_name": "vls", "data_files": [{"split": "train", "path": "vls/*"}]}, {"config_name": "vo", "data_files": [{"split": "train", "path": "vo/*"}]}, {"config_name": "wa", "data_files": [{"split": "train", "path": "wa/*"}]}, {"config_name": "war", "data_files": [{"split": "train", "path": "war/*"}]}, {"config_name": "wo", "data_files": [{"split": "train", "path": "wo/*"}]}, {"config_name": "wuu", "data_files": [{"split": "train", "path": "wuu/*"}]}, {"config_name": "xal", "data_files": [{"split": "train", "path": "xal/*"}]}, {"config_name": "xh", "data_files": [{"split": "train", "path": "xh/*"}]}, {"config_name": "xmf", "data_files": [{"split": "train", "path": "xmf/*"}]}, {"config_name": "yi", "data_files": [{"split": "train", "path": "yi/*"}]}, {"config_name": "yo", "data_files": [{"split": "train", "path": "yo/*"}]}, {"config_name": "za", "data_files": [{"split": "train", "path": "za/*"}]}, {"config_name": "zea", "data_files": [{"split": "train", "path": "zea/*"}]}, {"config_name": "zh", "data_files": [{"split": "train", "path": "zh/*"}]}, {"config_name": "zh-classical", "data_files": [{"split": "train", "path": "zh-classical/*"}]}, {"config_name": "zh-min-nan", "data_files": [{"split": "train", "path": "zh-min-nan/*"}]}, {"config_name": "zh-yue", "data_files": [{"split": "train", "path": "zh-yue/*"}]}, {"config_name": "zu", "data_files": [{"split": "train", "path": "zu/*"}]}]}
2024-01-26T13:30:32+00:00
[]
[]
TAGS #region-us
Multilingual Embeddings for Wikipedia in 300+ Languages
=======================================================

This dataset contains the wikimedia/wikipedia dataset dump from 2023-11-01 from Wikipedia in all 300+ languages.

The individual articles have been chunked and embedded with the state-of-the-art multilingual Cohere Embed V3 embedding model. This enables an easy way to semantically search across all of Wikipedia or to use it as a knowledge source for your RAG application. In total, it is close to 250M paragraphs / embeddings.

You can also use the model to perform cross-lingual search: Enter your search query in any language and get the most relevant results back.

Loading the dataset
-------------------

### Loading the document embeddings

The 'train' split contains all document embeddings of the corpus.

You can either load the dataset like this:

Or you can also stream it without downloading it before:

Note that, depending on the language, the download can be quite large.

Search
------

A full search example (on the first 1,000 paragraphs):

Overview
--------

The following table contains all language codes together with the total numbers of passages.
[ "### Loading the document embeddings\n\n\nThe 'corpus' split contains all document embeddings of the corpus.\n\n\nYou can either load the dataset like this:\n\n\nOr you can also stream it without downloading it before:\n\n\nNote, depending on the language, the download can be quite large.\n\n\nSearch\n------\n\n\nA full search example (on the first 1,000 paragraphs):\n\n\nOverview\n--------\n\n\nThe following table contains all language codes together with the total numbers of passages." ]
[ "TAGS\n#region-us \n", "### Loading the document embeddings\n\n\nThe 'corpus' split contains all document embeddings of the corpus.\n\n\nYou can either load the dataset like this:\n\n\nOr you can also stream it without downloading it before:\n\n\nNote, depending on the language, the download can be quite large.\n\n\nSearch\n------\n\n\nA full search example (on the first 1,000 paragraphs):\n\n\nOverview\n--------\n\n\nThe following table contains all language codes together with the total numbers of passages." ]
1015ee38bd8a36549b344008f7a49af72956a7fe
# PIE Dataset Card for "aae2" This is a [PyTorch-IE](https://github.com/ChristophAlt/pytorch-ie) wrapper for the Argument Annotated Essays v2 (AAE2) dataset ([paper](https://aclanthology.org/J17-3005.pdf) and [homepage](https://tudatalib.ulb.tu-darmstadt.de/handle/tudatalib/2422)). Since the AAE2 dataset is published in the [BRAT standoff format](https://brat.nlplab.org/standoff.html), this dataset builder is based on the [PyTorch-IE brat dataset loading script](https://huggingface.co/datasets/pie/brat). Therefore, the `aae2` dataset as described here follows the data structure from the [PIE brat dataset card](https://huggingface.co/datasets/pie/brat). ### Dataset Summary Argument Annotated Essays Corpus (AAEC) ([Stab and Gurevych, 2017](https://aclanthology.org/J17-3005.pdf)) contains student essays. A stance for a controversial theme is expressed by a major claim component as well as claim components, and premise components justify or refute the claims. Attack and support labels are defined as relations. The span covers a statement, *which can stand in isolation as a complete sentence*, according to the AAEC annotation guidelines. All components are annotated with minimum boundaries of a clause or sentence excluding so-called "shell" language such as *On the other hand* and *Hence*. (Morio et al., 2022, p. 642) There is no premise that links to another premise or claim in a different paragraph. That means, an argumentation tree structure is complete within each paragraph. Therefore, it is possible to train a model on the full documents or just at the paragraph-level which is usually less memory-exhaustive (Eger et al., 2017, p. 16). ### Supported Tasks and Leaderboards - **Tasks**: Argumentation Mining, Component Identification, Component Classification, Structure Identification - **Leaderboard:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Languages The language in the dataset is English (persuasive essays). ### Dataset Variants The `aae2` dataset comes in a single version (`default`) with `BratDocumentWithMergedSpans` as document type. Note, that this in contrast to the base brat dataset, where the document type for the `default` variant is `BratDocument`. The reason is that the AAE2 dataset has already been published with only single-fragment spans. Without any need to merge fragments, the document type `BratDocumentWithMergedSpans` is easier to handle for most of the task modules. ### Data Schema See [PIE-Brat Data Schema](https://huggingface.co/datasets/pie/brat#data-schema). ### Usage ```python from pie_datasets import load_dataset, builders # load default version datasets = load_dataset("pie/aae2") doc = datasets["train"][0] assert isinstance(doc, builders.brat.BratDocumentWithMergedSpans) ``` ### Data Splits | Statistics | Train | Test | | ---------------------------------------------------------------- | -------------------------: | -----------------------: | | No. of document | 322 | 80 | | Components <br/>- `MajorClaim`<br/>- `Claim`<br/>- `Premise` | <br/>598<br/>1202<br/>3023 | <br/>153<br/>304<br/>809 | | Relations\*<br/>- `supports`<br/>- `attacks` | <br/>3820<br/>405 | <br/>1021<br/>92 | \* included all relations between claims and premises and all claim attributions. See further statistics in Stab & Gurevych (2017), p. 650, Table A.1. 
### Label Descriptions

#### Components

| Components   | Count | Percentage |
| ------------ | ----: | ---------: |
| `MajorClaim` |   751 |     12.3 % |
| `Claim`      |  1506 |     24.7 % |
| `Premise`    |  3832 |     62.9 % |

- `MajorClaim` is the root node of the argumentation structure and represents the author’s standpoint on the topic. Essay bodies either support or attack the author’s standpoint expressed in the major claim. The major claim can be mentioned multiple times in a single document.
- `Claim` constitutes the central component of each argument. Each one has at least one premise and takes the stance attribute value "for" or "against" with regard to the major claim.
- `Premise` provides the reasons of the argument; each premise is linked either to a claim or to another premise.

**Note that** relations between `MajorClaim` and `Claim` were not annotated; however, each claim is annotated with an `Attribute` annotation with the value `for` or `against`, which indicates the relation between itself and the `MajorClaim`. In addition, when two non-related `Claim`s appear in one paragraph, there are also no relations between them.

#### Relations

| Relations           | Count | Percentage |
| ------------------- | ----: | ---------: |
| support: `supports` |  3613 |     94.3 % |
| attack: `attacks`   |   219 |      5.7 % |

- "Each premise `p` has one **outgoing relation** (i.e., there is a relation that has `p` as source component) and none or several **incoming relations** (i.e., there can be a relation with `p` as target component)."
- "A `Claim` can exhibit several **incoming relations** but no **outgoing relation**." (S&G, 2017, p. 68)
- "The relations from the claims of the arguments to the major claim are dotted since we will not explicitly annotated them. The relation of each argument to the major claim is indicated by a stance attribute of each claim. This attribute can either be for or against as illustrated in figure 1.4." (Stab & Gurevych, *Guidelines for Annotating Argumentation Structures in Persuasive Essays*, 2015, p. 5)

See further description in Stab & Gurevych 2017, p. 627 and the [annotation guideline](https://github.com/ArneBinder/pie-datasets/blob/db94035602610cefca2b1678aa2fe4455c96155d/data/datasets/ArgumentAnnotatedEssays-2.0/guideline.pdf).
### Document Converters

The dataset provides document converters for the following target document types:

- `pytorch_ie.documents.TextDocumentWithLabeledSpansAndBinaryRelations` with layers:
  - `labeled_spans`: `LabeledSpan` annotations, converted from `BratDocumentWithMergedSpans`'s `spans`
    - labels: `MajorClaim`, `Claim`, `Premise`
  - `binary_relations`: `BinaryRelation` annotations, converted from `BratDocumentWithMergedSpans`'s `relations`
    - there are two conversion methods that convert `Claim` attributes to their relations to `MajorClaim` (also see the label-count changes after this relation conversion [here below](#label-statistics-after-document-conversion)):
      - `connect_first` (default setting):
        - build a `supports` or `attacks` relation from each `Claim` to the first `MajorClaim`, depending on the `Claim`'s attribute (`for` or `against`), and
        - build a `semantically_same` relation from each following `MajorClaim` to the first `MajorClaim`
      - `connect_all`:
        - build a `supports` or `attacks` relation from each `Claim` to every `MajorClaim`
        - no relations between the `MajorClaim`s
    - labels: `supports`, `attacks`, and `semantically_same` if `connect_first`
- `pytorch_ie.documents.TextDocumentWithLabeledSpansBinaryRelationsAndLabeledPartitions` with layers:
  - `labeled_spans`, as above
  - `binary_relations`, as above
  - `labeled_partitions`: `LabeledSpan` annotations, created by splitting `BratDocumentWithMergedSpans`'s `text` at new lines (`\n`)
    - every partition is labeled as `paragraph`

See [here](https://github.com/ChristophAlt/pytorch-ie/blob/main/src/pytorch_ie/documents.py) for the document type definitions.

#### Label Statistics after Document Conversion

When converting from `BratDocumentWithMergedSpans` to `TextDocumentWithLabeledSpansAndBinaryRelations` and `TextDocumentWithLabeledSpansBinaryRelationsAndLabeledPartitions`, we apply a relation-conversion method (see above) that changes the label counts for the relations, as follows:

1. `connect_first` (default):

| Relations                  | Count | Percentage |
| -------------------------- | ----: | ---------: |
| support: `supports`        |  4841 |     85.1 % |
| attack: `attacks`          |   497 |      8.7 % |
| other: `semantically_same` |   349 |      6.2 % |

2. `connect_all`:

| Relations           | Count | Percentage |
| ------------------- | ----: | ---------: |
| support: `supports` |  5958 |     89.3 % |
| attack: `attacks`   |   715 |     10.7 % |

As a consistency check for `connect_first`: the 1,506 `Claim` attributes account for the 1,228 added `supports` relations plus the 278 added `attacks` relations, and the 349 `semantically_same` relations correspond to the 751 `MajorClaim` mentions minus the 402 documents, each of which keeps its first major claim as the link target.

## Dataset Creation

### Curation Rationale

"The identification of argumentation structures involves several subtasks like separating argumentative from non-argumentative text units (Moens et al. 2007; Florou et al. 2013), classifying argument components into claims and premises (Mochales-Palau and Moens 2011; Rooney, Wang, and Browne 2012; Stab and Gurevych 2014b), and identifying argumentative relations (Mochales-Palau and Moens 2009; Peldszus 2014; Stab and Gurevych 2014b). However, an approach that covers all subtasks is still missing. Furthermore, most approaches operate locally and do not optimize the global argumentation structure.

"In addition to the lack of end-to-end approaches for parsing argumentation structures, there are relatively few corpora annotated with argumentation structures at the discourse-level." (p. 621)

"Our primary motivation for this work is to create argument analysis methods for argumentative writing support systems and to achieve a better understanding of argumentation structures." (p. 622)

### Source Data

Persuasive essays were collected from [essayforum.com](https://essayforum.com/) (see the essay prompts, along with the essays' `id`s, [here](https://github.com/ArneBinder/pie-datasets/blob/db94035602610cefca2b1678aa2fe4455c96155d/data/datasets/ArgumentAnnotatedEssays-2.0/prompts.csv)).

#### Initial Data Collection and Normalization

"We randomly selected 402 English essays with a description of the writing prompt from essayforum.com. This online forum is an active community that provides correction and feedback about different texts such as research papers, essays, or poetry. For example, students post their essays in order to receive feedback about their writing skills while preparing for standardized language tests. The corpus includes 7,116 sentences with 147,271 tokens." (p. 630)

#### Who are the source language producers?

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Annotations

#### Annotation process

The annotations were done using the BRAT Rapid Annotation Tool ([Stenetorp et al., 2012](https://aclanthology.org/E12-2021/)). All three annotators independently annotated a random subset of 80 essays. The remaining 322 essays were annotated by the expert annotator.

The authors evaluated the inter-annotator agreement using observed agreement and Fleiss’ κ (Fleiss 1971) for each label on each of the sub-tasks, namely component identification, component classification, and relation identification. The results were reported in their [paper](https://aclanthology.org/J17-3005.pdf) in Tables 2-4.

#### Who are the annotators?

Three non-native speakers; one of the three being an expert annotator.

### Personal and Sensitive Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Considerations for Using the Data

### Social Impact of Dataset

"\[Computational Argumentation\] have broad application potential in various areas such as legal decision support (Mochales-Palau and Moens 2009), information retrieval (Carstens and Toni 2015), policy making (Sardianos et al. 2015), and debating technologies (Levy et al. 2014; Rinott et al. 2015)." (p. 619)

### Discussion of Biases

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Other Known Limitations

The relations between claims and major claims are not explicitly annotated.

"The proportion of non-argumentative text amounts to 47,474 tokens (32.2%) and 1,631 sentences (22.9%). The number of sentences with several argument components is 583, of which 302 include several components with different types (e.g., a claim followed by premise)... \[T\]he identification of argument components requires the separation of argumentative from non-argumentative text units and the recognition of component boundaries at the token level...The proportion of paragraphs with unlinked argument components (e.g., unsupported claims without incoming relations) is 421 (23%). Thus, methods that link all argument components in a paragraph are only of limited use for identifying the argumentation structures in our corpus.

"Most of the arguments are convergent—that is, the depth of the argument is 1. The number of arguments with serial structure is 236 (20.9%)." (p. 634)

## Additional Information

### Dataset Curators

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Licensing Information

**License**: [License description by TU Darmstadt](https://tudatalib.ulb.tu-darmstadt.de/bitstream/handle/tudatalib/2422/arg_annotated_essays_v2_license.pdf?sequence=2&isAllowed=y)

**Funding**: This work has been supported by the Volkswagen Foundation as part of the Lichtenberg-Professorship Program under grant no. I/82806 and by the German Federal Ministry of Education and Research (BMBF) as a part of the Software Campus project AWS under grant no. 01—S12054.

### Citation Information

```
@article{stab2017parsing,
  title={Parsing argumentation structures in persuasive essays},
  author={Stab, Christian and Gurevych, Iryna},
  journal={Computational Linguistics},
  volume={43},
  number={3},
  pages={619--659},
  year={2017},
  publisher={MIT Press}
}
```

```
@misc{https://tudatalib.ulb.tu-darmstadt.de/handle/tudatalib/2422,
  url = { https://tudatalib.ulb.tu-darmstadt.de/handle/tudatalib/2422 },
  author = { Stab, Christian and Gurevych, Iryna },
  keywords = { Argument Mining, 409-06 Informationssysteme, Prozess- und Wissensmanagement, 004 },
  publisher = { Technical University of Darmstadt },
  year = { 2017 },
  copyright = { License description },
  title = { Argument Annotated Essays (version 2) }
}
```

### Contributions

Thanks to [@ArneBinder](https://github.com/ArneBinder) and [@idalr](https://github.com/idalr) for adding this dataset.
pie/aae2
[ "region:us" ]
2024-01-11T16:32:44+00:00
{}
2024-01-11T16:33:03+00:00
[]
[]
TAGS #region-us
PIE Dataset Card for "aae2" =========================== This is a PyTorch-IE wrapper for the Argument Annotated Essays v2 (AAE2) dataset (paper and homepage). Since the AAE2 dataset is published in the BRAT standoff format, this dataset builder is based on the PyTorch-IE brat dataset loading script. Therefore, the 'aae2' dataset as described here follows the data structure from the PIE brat dataset card. ### Dataset Summary Argument Annotated Essays Corpus (AAEC) (Stab and Gurevych, 2017) contains student essays. A stance for a controversial theme is expressed by a major claim component as well as claim components, and premise components justify or refute the claims. Attack and support labels are defined as relations. The span covers a statement, *which can stand in isolation as a complete sentence*, according to the AAEC annotation guidelines. All components are annotated with minimum boundaries of a clause or sentence excluding so-called "shell" language such as *On the other hand* and *Hence*. (Morio et al., 2022, p. 642) There is no premise that links to another premise or claim in a different paragraph. That means, an argumentation tree structure is complete within each paragraph. Therefore, it is possible to train a model on the full documents or just at the paragraph-level which is usually less memory-exhaustive (Eger et al., 2017, p. 16). ### Supported Tasks and Leaderboards * Tasks: Argumentation Mining, Component Identification, Component Classification, Structure Identification * Leaderboard: ### Languages The language in the dataset is English (persuasive essays). ### Dataset Variants The 'aae2' dataset comes in a single version ('default') with 'BratDocumentWithMergedSpans' as document type. Note, that this in contrast to the base brat dataset, where the document type for the 'default' variant is 'BratDocument'. The reason is that the AAE2 dataset has already been published with only single-fragment spans. Without any need to merge fragments, the document type 'BratDocumentWithMergedSpans' is easier to handle for most of the task modules. ### Data Schema See PIE-Brat Data Schema. ### Usage ### Data Splits \* included all relations between claims and premises and all claim attributions. See further statistics in Stab & Gurevych (2017), p. 650, Table A.1. ### Label Descriptions #### Components * 'MajorClaim' is the root node of the argumentation structure and represents the author’s standpoint on the topic. Essay bodies either support or attack the author’s standpoint expressed in the major claim. The major claim can be mentioned multiple times in a single document. * 'Claim' constitutes the central component of each argument. Each one has at least one premise and takes stance attribute values "for" or "against" with regarding the major claim. * 'Premise' is the reasons of the argument; either linked to claim or another premise. Note that relations between 'MajorClaim' and 'Claim' were not annotated; however, each claim is annotated with an 'Attribute' annotation with value 'for' or 'against' - which indicates the relation between itself and 'MajorClaim'. In addition, when two non-related 'Claim' 's appear in one paragraph, there is also no relations to one another. #### Relations * "Each premise 'p' has one outgoing relation (i.e., there is a relation that has p as source component) and none or several incoming relations (i.e., there can be a relation with 'p' as target component)." * "A 'Claim' can exhibit several incoming relations but no outgoing relation." (S&G, 2017, p. 
68) * "The relations from the claims of the arguments to the major claim are dotted since we will not explicitly annotated them. The relation of each argument to the major claim is indicated by a stance attribute of each claim. This attribute can either be for or against as illustrated in figure 1.4." (Stab & Gurevych, *Guidelines for Annotating Argumentation Structures in Persuasive Essays*, 2015, p. 5) See further description in Stab & Gurevych 2017, p.627 and the annotation guideline. ### Document Converters The dataset provides document converters for the following target document types: * 'pytorch\_ie.documents.TextDocumentWithLabeledSpansAndBinaryRelations' with layers: + 'labeled\_spans': 'LabeledSpan' annotations, converted from 'BratDocumentWithMergedSpans''s 'spans' - labels: 'MajorClaim', 'Claim', 'Premise' + 'binary\_relations': 'BinaryRelation' annotations, converted from 'BratDocumentWithMergedSpans''s 'relations' - there are two conversion methods that convert 'Claim' attributes to their relations to 'MajorClaim' (also see the label-count changes after this relation conversion here below): * 'connect\_first' (default setting): + build a 'supports' or 'attacks' relation from each 'Claim' to the first 'MajorClaim' depending on the 'Claim''s attribute ('for' or 'against'), and + build a 'semantically\_same' relation between following 'MajorClaim' to the first 'MajorClaim' * 'connect\_all' + build a 'supports' or 'attacks' relation from each 'Claim' to every 'MajorClaim' + no relations between each 'MajorClaim' - labels: 'supports', 'attacks', and 'semantically\_same' if 'connect\_first' * 'pytorch\_ie.documents.TextDocumentWithLabeledSpansBinaryRelationsAndLabeledPartitions' with layers: + 'labeled\_spans', as above + 'binary\_relations', as above + 'labeled\_partitions', 'LabeledSpan' annotations, created from splitting 'BratDocumentWithMergedSpans''s 'text' at new lines ('\n'). - every partition is labeled as 'paragraph' See here for the document type definitions. #### Label Statistics after Document Conversion When converting from 'BratDocumentWithMergedSpan' to 'TextDocumentWithLabeledSpansAndBinaryRelations' and 'TextDocumentWithLabeledSpansBinaryRelationsAndLabeledPartitions', we apply a relation-conversion method (see above) that changes the label counts for the relations, as follows: 1. 'connect\_first' (default): 2. 'connect\_all' Dataset Creation ---------------- ### Curation Rationale "The identification of argumentation structures involves several subtasks like separating argumentative from non-argumentative text units (Moens et al. 2007; Florou et al. 2013), classifying argument components into claims and premises (Mochales-Palau and Moens 2011; Rooney, Wang, and Browne 2012; Stab and Gurevych 2014b), and identifying argumentative relations (Mochales-Palau and Moens 2009; Peldszus 2014; Stab and Gurevych 2014b). However, an approach that covers all subtasks is still missing. However, an approach that covers all subtasks is still missing. Furthermore, most approaches operate locally and do not optimize the global argumentation structure. "In addition, to the lack of end-to-end approaches for parsing argumentation structures, there are relatively few corpora annotated with argumentation structures at the discourse-level." (p. 621) "Our primary motivation for this work is to create argument analysis methods for argumentative writing support systems and to achieve a better understanding of argumentation structures." (p. 
622) ### Source Data Persuasive essays were collected from URL (See essay prompts, along with the essay's 'id''s here). #### Initial Data Collection and Normalization "We randomly selected 402 English essays with a description of the writing prompt from URL. This online forum is an active community that provides correction and feedback about different texts such as research papers, essays, or poetry. For example, students post their essays in order to receive feedback about their writing skills while preparing for standardized language tests. The corpus includes 7,116 sentences with 147,271 tokens." (p. 630) #### Who are the source language producers? ### Annotations #### Annotation process The annotation were done using BRAT Rapid Annotation Tool (Stenetorp et al., 2012). All three annotators independently annotated a random subset of 80 essays. The remaining 322 essays were annotated by the expert annotator. The authors evaluated the inter-annotator agreement using observed agreement and Fleiss’ κ (Fleiss 1971), on each label on each sub-tasks, namely, component identification, component classification, and relation identification. The results were reported in their paper in Tables 2-4. #### Who are the annotators? Three non-native speakers; one of the three being an expert annotator. ### Personal and Sensitive Information Considerations for Using the Data --------------------------------- ### Social Impact of Dataset "[Computational Argumentation] have broad application potential in various areas such as legal decision support (Mochales-Palau and Moens 2009), information retrieval (Carstens and Toni 2015), policy making (Sardianos et al. 2015), and debating technologies (Levy et al. 2014; Rinott et al. 2015)." (p. 619) ### Discussion of Biases ### Other Known Limitations The relations between claims and major claims are not explicitly annotated. "The proportion of non-argumentative text amounts to 47,474 tokens (32.2%) and 1,631 sentences (22.9%). The number of sentences with several argument components is 583, of which 302 include several components with different types (e.g., a claim followed by premise)... [T]he identification of argument components requires the separation of argumentative from non-argumentative text units and the recognition of component boundaries at the token level...The proportion of paragraphs with unlinked argument components (e.g., unsupported claims without incoming relations) is 421 (23%). Thus, methods that link all argument components in a paragraph are only of limited use for identifying the argumentation structures in our corpus. "Most of the arguments are convergent—that is, the depth of the argument is 1. The number of arguments with serial structure is 236 (20.9%)." (p. 634) Additional Information ---------------------- ### Dataset Curators ### Licensing Information License: License description by TU Darmstadt Funding: This work has been supported by the Volkswagen Foundation as part of the Lichtenberg-Professorship Program under grant no. I/82806 and by the German Federal Ministry of Education and Research (BMBF) as a part of the Software Campus project AWS under grant no. 01—S12054. ### Contributions Thanks to @ArneBinder and @idalr for adding this dataset.
[ "### Dataset Summary\n\n\nArgument Annotated Essays Corpus (AAEC) (Stab and Gurevych, 2017) contains student essays. A stance for a controversial theme is expressed by a major claim component as well as claim components, and premise components justify or refute the claims. Attack and support labels are defined as relations. The span covers a statement, *which can stand in isolation as a complete sentence*, according to the AAEC annotation guidelines. All components are annotated with minimum boundaries of a clause or sentence excluding so-called \"shell\" language such as *On the other hand* and *Hence*. (Morio et al., 2022, p. 642)\n\n\nThere is no premise that links to another premise or claim in a different paragraph. That means, an argumentation tree structure is complete within each paragraph. Therefore, it is possible to train a model on the full documents or just at the paragraph-level which is usually less memory-exhaustive (Eger et al., 2017, p. 16).", "### Supported Tasks and Leaderboards\n\n\n* Tasks: Argumentation Mining, Component Identification, Component Classification, Structure Identification\n* Leaderboard:", "### Languages\n\n\nThe language in the dataset is English (persuasive essays).", "### Dataset Variants\n\n\nThe 'aae2' dataset comes in a single version ('default') with 'BratDocumentWithMergedSpans' as document type. Note, that this in contrast to the base brat dataset, where the document type for the 'default' variant is 'BratDocument'. The reason is that the AAE2 dataset has already been published with only single-fragment spans. Without any need to merge fragments, the document type 'BratDocumentWithMergedSpans' is easier to handle for most of the task modules.", "### Data Schema\n\n\nSee PIE-Brat Data Schema.", "### Usage", "### Data Splits\n\n\n\n\\* included all relations between claims and premises and all claim attributions.\n\n\nSee further statistics in Stab & Gurevych (2017), p. 650, Table A.1.", "### Label Descriptions", "#### Components\n\n\n\n* 'MajorClaim' is the root node of the argumentation structure and represents the author’s standpoint on the topic. Essay bodies either support or attack the author’s standpoint expressed in the major claim. The major claim can be mentioned multiple times in a single document.\n* 'Claim' constitutes the central component of each argument. Each one has at least one premise and takes stance attribute values \"for\" or \"against\" with regarding the major claim.\n* 'Premise' is the reasons of the argument; either linked to claim or another premise.\n\n\nNote that relations between 'MajorClaim' and 'Claim' were not annotated; however, each claim is annotated with an 'Attribute' annotation with value 'for' or 'against' - which indicates the relation between itself and 'MajorClaim'. In addition, when two non-related 'Claim' 's appear in one paragraph, there is also no relations to one another.", "#### Relations\n\n\n\n* \"Each premise 'p' has one outgoing relation (i.e., there is a relation that has p as source component) and none or several incoming relations (i.e., there can be a relation with 'p' as target component).\"\n* \"A 'Claim' can exhibit several incoming relations but no outgoing relation.\" (S&G, 2017, p. 68)\n* \"The relations from the claims of the arguments to the major claim are dotted since we will not explicitly annotated them. The relation of each argument to the major claim is indicated by a stance attribute of each claim. 
This attribute can either be for or against as illustrated in figure 1.4.\" (Stab & Gurevych, *Guidelines for Annotating Argumentation Structures in Persuasive Essays*, 2015, p. 5)\n\n\nSee further description in Stab & Gurevych 2017, p.627 and the annotation guideline.", "### Document Converters\n\n\nThe dataset provides document converters for the following target document types:\n\n\n* 'pytorch\\_ie.documents.TextDocumentWithLabeledSpansAndBinaryRelations' with layers:\n\t+ 'labeled\\_spans': 'LabeledSpan' annotations, converted from 'BratDocumentWithMergedSpans''s 'spans'\n\t\t- labels: 'MajorClaim', 'Claim', 'Premise'\n\t+ 'binary\\_relations': 'BinaryRelation' annotations, converted from 'BratDocumentWithMergedSpans''s 'relations'\n\t\t- there are two conversion methods that convert 'Claim' attributes to their relations to 'MajorClaim' (also see the label-count changes after this relation conversion here below):\n\t\t\t* 'connect\\_first' (default setting):\n\t\t\t\t+ build a 'supports' or 'attacks' relation from each 'Claim' to the first 'MajorClaim' depending on the 'Claim''s attribute ('for' or 'against'), and\n\t\t\t\t+ build a 'semantically\\_same' relation between following 'MajorClaim' to the first 'MajorClaim'\n\t\t\t* 'connect\\_all'\n\t\t\t\t+ build a 'supports' or 'attacks' relation from each 'Claim' to every 'MajorClaim'\n\t\t\t\t+ no relations between each 'MajorClaim'\n\t\t- labels: 'supports', 'attacks', and 'semantically\\_same' if 'connect\\_first'\n* 'pytorch\\_ie.documents.TextDocumentWithLabeledSpansBinaryRelationsAndLabeledPartitions' with layers:\n\t+ 'labeled\\_spans', as above\n\t+ 'binary\\_relations', as above\n\t+ 'labeled\\_partitions', 'LabeledSpan' annotations, created from splitting 'BratDocumentWithMergedSpans''s 'text' at new lines ('\\n').\n\t\t- every partition is labeled as 'paragraph'\n\n\nSee here for the document type\ndefinitions.", "#### Label Statistics after Document Conversion\n\n\nWhen converting from 'BratDocumentWithMergedSpan' to 'TextDocumentWithLabeledSpansAndBinaryRelations' and 'TextDocumentWithLabeledSpansBinaryRelationsAndLabeledPartitions',\nwe apply a relation-conversion method (see above) that changes the label counts for the relations, as follows:\n\n\n1. 'connect\\_first' (default):\n\n\n\n2. 'connect\\_all'\n\n\n\nDataset Creation\n----------------", "### Curation Rationale\n\n\n\"The identification of argumentation structures involves several subtasks like separating argumentative from non-argumentative text units (Moens et al. 2007; Florou\net al. 2013), classifying argument components into claims and premises (Mochales-Palau and Moens 2011; Rooney, Wang, and Browne 2012; Stab and Gurevych 2014b),\nand identifying argumentative relations (Mochales-Palau and Moens 2009; Peldszus\n2014; Stab and Gurevych 2014b). However, an approach that covers all subtasks is still\nmissing. However, an approach that covers all subtasks is still\nmissing. Furthermore, most approaches operate locally and do not optimize the global\nargumentation structure.\n\n\n\"In addition,\nto the lack of end-to-end approaches for parsing argumentation structures, there are\nrelatively few corpora annotated with argumentation structures at the discourse-level.\" (p. 621)\n\n\n\"Our primary motivation for this work is to create argument analysis methods\nfor argumentative writing support systems and to achieve a better understanding\nof argumentation structures.\" (p. 
622)", "### Source Data\n\n\nPersuasive essays were collected from URL (See essay prompts, along with the essay's 'id''s here).", "#### Initial Data Collection and Normalization\n\n\n\"We randomly selected 402 English essays with a description of the writing prompt from\nURL. This online forum is an active community that provides correction and\nfeedback about different texts such as research papers, essays, or poetry. For example,\nstudents post their essays in order to receive feedback about their writing skills while\npreparing for standardized language tests. The corpus includes 7,116 sentences with\n147,271 tokens.\" (p. 630)", "#### Who are the source language producers?", "### Annotations", "#### Annotation process\n\n\nThe annotation were done using BRAT Rapid Annotation Tool (Stenetorp et al., 2012).\n\n\nAll three annotators independently annotated a random subset of 80 essays. The\nremaining 322 essays were annotated by the expert annotator.\n\n\nThe authors evaluated the inter-annotator agreement using observed agreement and Fleiss’ κ (Fleiss 1971), on each label on each sub-tasks,\nnamely, component identification, component classification, and relation identification.\nThe results were reported in their paper in Tables 2-4.", "#### Who are the annotators?\n\n\nThree non-native speakers; one of the three being an expert annotator.", "### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------", "### Social Impact of Dataset\n\n\n\"[Computational Argumentation] have\nbroad application potential in various areas such as legal decision support (Mochales-Palau and Moens 2009), information retrieval (Carstens and Toni 2015), policy making (Sardianos et al. 2015), and debating technologies (Levy et al. 2014; Rinott et al.\n2015).\" (p. 619)", "### Discussion of Biases", "### Other Known Limitations\n\n\nThe relations between claims and major claims are not explicitly annotated.\n\n\n\"The proportion of non-argumentative text amounts to 47,474 tokens (32.2%) and\n1,631 sentences (22.9%). The number of sentences with several argument components\nis 583, of which 302 include several components with different types (e.g., a claim followed by premise)...\n[T]he identification of argument components requires the\nseparation of argumentative from non-argumentative text units and the recognition of\ncomponent boundaries at the token level...The proportion of paragraphs with unlinked\nargument components (e.g., unsupported claims without incoming relations) is 421\n(23%). Thus, methods that link all argument components in a paragraph are only of\nlimited use for identifying the argumentation structures in our corpus.\n\n\n\"Most of the arguments are convergent—that is, the depth of the\nargument is 1. The number of arguments with serial structure is 236 (20.9%).\" (p. 634)\n\n\nAdditional Information\n----------------------", "### Dataset Curators", "### Licensing Information\n\n\nLicense: License description by TU Darmstadt\n\n\nFunding: This work has been supported by the\nVolkswagen Foundation as part of the\nLichtenberg-Professorship Program under\ngrant no. I/82806 and by the German Federal\nMinistry of Education and Research (BMBF)\nas a part of the Software Campus project\nAWS under grant no. 01—S12054.", "### Contributions\n\n\nThanks to @ArneBinder and @idalr for adding this dataset." ]
[ "TAGS\n#region-us \n", "### Dataset Summary\n\n\nArgument Annotated Essays Corpus (AAEC) (Stab and Gurevych, 2017) contains student essays. A stance for a controversial theme is expressed by a major claim component as well as claim components, and premise components justify or refute the claims. Attack and support labels are defined as relations. The span covers a statement, *which can stand in isolation as a complete sentence*, according to the AAEC annotation guidelines. All components are annotated with minimum boundaries of a clause or sentence excluding so-called \"shell\" language such as *On the other hand* and *Hence*. (Morio et al., 2022, p. 642)\n\n\nThere is no premise that links to another premise or claim in a different paragraph. That means, an argumentation tree structure is complete within each paragraph. Therefore, it is possible to train a model on the full documents or just at the paragraph-level which is usually less memory-exhaustive (Eger et al., 2017, p. 16).", "### Supported Tasks and Leaderboards\n\n\n* Tasks: Argumentation Mining, Component Identification, Component Classification, Structure Identification\n* Leaderboard:", "### Languages\n\n\nThe language in the dataset is English (persuasive essays).", "### Dataset Variants\n\n\nThe 'aae2' dataset comes in a single version ('default') with 'BratDocumentWithMergedSpans' as document type. Note, that this in contrast to the base brat dataset, where the document type for the 'default' variant is 'BratDocument'. The reason is that the AAE2 dataset has already been published with only single-fragment spans. Without any need to merge fragments, the document type 'BratDocumentWithMergedSpans' is easier to handle for most of the task modules.", "### Data Schema\n\n\nSee PIE-Brat Data Schema.", "### Usage", "### Data Splits\n\n\n\n\\* included all relations between claims and premises and all claim attributions.\n\n\nSee further statistics in Stab & Gurevych (2017), p. 650, Table A.1.", "### Label Descriptions", "#### Components\n\n\n\n* 'MajorClaim' is the root node of the argumentation structure and represents the author’s standpoint on the topic. Essay bodies either support or attack the author’s standpoint expressed in the major claim. The major claim can be mentioned multiple times in a single document.\n* 'Claim' constitutes the central component of each argument. Each one has at least one premise and takes stance attribute values \"for\" or \"against\" with regarding the major claim.\n* 'Premise' is the reasons of the argument; either linked to claim or another premise.\n\n\nNote that relations between 'MajorClaim' and 'Claim' were not annotated; however, each claim is annotated with an 'Attribute' annotation with value 'for' or 'against' - which indicates the relation between itself and 'MajorClaim'. In addition, when two non-related 'Claim' 's appear in one paragraph, there is also no relations to one another.", "#### Relations\n\n\n\n* \"Each premise 'p' has one outgoing relation (i.e., there is a relation that has p as source component) and none or several incoming relations (i.e., there can be a relation with 'p' as target component).\"\n* \"A 'Claim' can exhibit several incoming relations but no outgoing relation.\" (S&G, 2017, p. 68)\n* \"The relations from the claims of the arguments to the major claim are dotted since we will not explicitly annotated them. The relation of each argument to the major claim is indicated by a stance attribute of each claim. 
This attribute can either be for or against as illustrated in figure 1.4.\" (Stab & Gurevych, *Guidelines for Annotating Argumentation Structures in Persuasive Essays*, 2015, p. 5)\n\n\nSee further description in Stab & Gurevych 2017, p.627 and the annotation guideline.", "### Document Converters\n\n\nThe dataset provides document converters for the following target document types:\n\n\n* 'pytorch\\_ie.documents.TextDocumentWithLabeledSpansAndBinaryRelations' with layers:\n\t+ 'labeled\\_spans': 'LabeledSpan' annotations, converted from 'BratDocumentWithMergedSpans''s 'spans'\n\t\t- labels: 'MajorClaim', 'Claim', 'Premise'\n\t+ 'binary\\_relations': 'BinaryRelation' annotations, converted from 'BratDocumentWithMergedSpans''s 'relations'\n\t\t- there are two conversion methods that convert 'Claim' attributes to their relations to 'MajorClaim' (also see the label-count changes after this relation conversion here below):\n\t\t\t* 'connect\\_first' (default setting):\n\t\t\t\t+ build a 'supports' or 'attacks' relation from each 'Claim' to the first 'MajorClaim' depending on the 'Claim''s attribute ('for' or 'against'), and\n\t\t\t\t+ build a 'semantically\\_same' relation between following 'MajorClaim' to the first 'MajorClaim'\n\t\t\t* 'connect\\_all'\n\t\t\t\t+ build a 'supports' or 'attacks' relation from each 'Claim' to every 'MajorClaim'\n\t\t\t\t+ no relations between each 'MajorClaim'\n\t\t- labels: 'supports', 'attacks', and 'semantically\\_same' if 'connect\\_first'\n* 'pytorch\\_ie.documents.TextDocumentWithLabeledSpansBinaryRelationsAndLabeledPartitions' with layers:\n\t+ 'labeled\\_spans', as above\n\t+ 'binary\\_relations', as above\n\t+ 'labeled\\_partitions', 'LabeledSpan' annotations, created from splitting 'BratDocumentWithMergedSpans''s 'text' at new lines ('\\n').\n\t\t- every partition is labeled as 'paragraph'\n\n\nSee here for the document type\ndefinitions.", "#### Label Statistics after Document Conversion\n\n\nWhen converting from 'BratDocumentWithMergedSpan' to 'TextDocumentWithLabeledSpansAndBinaryRelations' and 'TextDocumentWithLabeledSpansBinaryRelationsAndLabeledPartitions',\nwe apply a relation-conversion method (see above) that changes the label counts for the relations, as follows:\n\n\n1. 'connect\\_first' (default):\n\n\n\n2. 'connect\\_all'\n\n\n\nDataset Creation\n----------------", "### Curation Rationale\n\n\n\"The identification of argumentation structures involves several subtasks like separating argumentative from non-argumentative text units (Moens et al. 2007; Florou\net al. 2013), classifying argument components into claims and premises (Mochales-Palau and Moens 2011; Rooney, Wang, and Browne 2012; Stab and Gurevych 2014b),\nand identifying argumentative relations (Mochales-Palau and Moens 2009; Peldszus\n2014; Stab and Gurevych 2014b). However, an approach that covers all subtasks is still\nmissing. However, an approach that covers all subtasks is still\nmissing. Furthermore, most approaches operate locally and do not optimize the global\nargumentation structure.\n\n\n\"In addition,\nto the lack of end-to-end approaches for parsing argumentation structures, there are\nrelatively few corpora annotated with argumentation structures at the discourse-level.\" (p. 621)\n\n\n\"Our primary motivation for this work is to create argument analysis methods\nfor argumentative writing support systems and to achieve a better understanding\nof argumentation structures.\" (p. 
622)", "### Source Data\n\n\nPersuasive essays were collected from URL (See essay prompts, along with the essay's 'id''s here).", "#### Initial Data Collection and Normalization\n\n\n\"We randomly selected 402 English essays with a description of the writing prompt from\nURL. This online forum is an active community that provides correction and\nfeedback about different texts such as research papers, essays, or poetry. For example,\nstudents post their essays in order to receive feedback about their writing skills while\npreparing for standardized language tests. The corpus includes 7,116 sentences with\n147,271 tokens.\" (p. 630)", "#### Who are the source language producers?", "### Annotations", "#### Annotation process\n\n\nThe annotation were done using BRAT Rapid Annotation Tool (Stenetorp et al., 2012).\n\n\nAll three annotators independently annotated a random subset of 80 essays. The\nremaining 322 essays were annotated by the expert annotator.\n\n\nThe authors evaluated the inter-annotator agreement using observed agreement and Fleiss’ κ (Fleiss 1971), on each label on each sub-tasks,\nnamely, component identification, component classification, and relation identification.\nThe results were reported in their paper in Tables 2-4.", "#### Who are the annotators?\n\n\nThree non-native speakers; one of the three being an expert annotator.", "### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------", "### Social Impact of Dataset\n\n\n\"[Computational Argumentation] have\nbroad application potential in various areas such as legal decision support (Mochales-Palau and Moens 2009), information retrieval (Carstens and Toni 2015), policy making (Sardianos et al. 2015), and debating technologies (Levy et al. 2014; Rinott et al.\n2015).\" (p. 619)", "### Discussion of Biases", "### Other Known Limitations\n\n\nThe relations between claims and major claims are not explicitly annotated.\n\n\n\"The proportion of non-argumentative text amounts to 47,474 tokens (32.2%) and\n1,631 sentences (22.9%). The number of sentences with several argument components\nis 583, of which 302 include several components with different types (e.g., a claim followed by premise)...\n[T]he identification of argument components requires the\nseparation of argumentative from non-argumentative text units and the recognition of\ncomponent boundaries at the token level...The proportion of paragraphs with unlinked\nargument components (e.g., unsupported claims without incoming relations) is 421\n(23%). Thus, methods that link all argument components in a paragraph are only of\nlimited use for identifying the argumentation structures in our corpus.\n\n\n\"Most of the arguments are convergent—that is, the depth of the\nargument is 1. The number of arguments with serial structure is 236 (20.9%).\" (p. 634)\n\n\nAdditional Information\n----------------------", "### Dataset Curators", "### Licensing Information\n\n\nLicense: License description by TU Darmstadt\n\n\nFunding: This work has been supported by the\nVolkswagen Foundation as part of the\nLichtenberg-Professorship Program under\ngrant no. I/82806 and by the German Federal\nMinistry of Education and Research (BMBF)\nas a part of the Software Campus project\nAWS under grant no. 01—S12054.", "### Contributions\n\n\nThanks to @ArneBinder and @idalr for adding this dataset." ]
5f63d55b7dd15a2288c7a8ef9d809a8bbecb6ef1
This dataset was developed by a team from **Skoltech's Intelligent Space Robotics Laboratory**.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/641d7ac93509072bd5c29f23/zCiesV4XzI6swpl6ry-TB.png)

The dataset was used to train an **LLM for Next-Step Robot Behavior Generation** based on a user command. Note that this model is part of a multi-agent artificial intelligence system for the dog robot described in the **CognitiveDog paper**. The dog robot explores the changing environment and dynamically makes decisions to solve the task set by the user, including through physical interaction with the environment.

**Please find the full paper preprint at: https://arxiv.org/abs/2401.09388**

The core part of the dataset is **CognitiveDog_original_dataset.json**. In **CognitiveDog_augmented_dataset.json** you can find more samples with the different objects used in our research experiment, meant to evaluate and compare the model's ability to work with seen and unseen objects and environments.

Paper preprint BibTeX citation:

    @misc{lykov2024cognitivedog,
          title={CognitiveDog: Large Multimodal Model Based System to Translate Vision and Language into Action of Quadruped Robot},
          author={Artem Lykov and Mikhail Litvinov and Mikhail Konenkov and Rinat Prochii and Nikita Burtsev and Ali Alridha Abdulkarim and Artem Bazhenov and Vladimir Berman and Dzmitry Tsetserukou},
          year={2024},
          eprint={2401.09388},
          archivePrefix={arXiv},
          primaryClass={cs.RO}
    }

Dataset BibTeX citation:

    @misc{cognitivedog_dataset,
          title={CognitiveDog\_dataset},
          author={Artem Lykov and Mikhail Litvinov and Mikhail Konenkov and Rinat Prochii and Nikita Burtsev and Ali Alridha Abdulkarim and Artem Bazhenov and Vladimir Berman and Dzmitry Tsetserukou},
          year={2024},
          publisher={Hugging Face},
          howpublished={\url{https://huggingface.co/ArtemLykov/CognitiveDog_dataset}}
    }
ArtemLykov/CognitiveDog_dataset
[ "license:cc-by-4.0", "arxiv:2401.09388", "region:us" ]
2024-01-11T17:09:02+00:00
{"license": "cc-by-4.0"}
2024-01-18T09:45:16+00:00
[ "2401.09388" ]
[]
TAGS #license-cc-by-4.0 #arxiv-2401.09388 #region-us
This dataset was developed by a team from Skoltech's Intelligent Space Robotics Laboratory.

!image/png

The dataset was used to train an LLM for Next-Step Robot Behavior Generation based on a user command. Note that this model is part of a multi-agent artificial intelligence system for the dog robot described in the CognitiveDog paper. The dog robot explores the changing environment and dynamically makes decisions to solve the task set by the user, including through physical interaction with the environment.

Please find the full paper preprint at: URL

The core part of the dataset is CognitiveDog_original_dataset.json. In CognitiveDog_augmented_dataset.json you can find more samples with the different objects used in our research experiment, meant to evaluate and compare the model's ability to work with seen and unseen objects and environments.

Paper preprint BibTeX citation:

@misc{lykov2024cognitivedog,
      title={CognitiveDog: Large Multimodal Model Based System to Translate Vision and Language into Action of Quadruped Robot},
      author={Artem Lykov and Mikhail Litvinov and Mikhail Konenkov and Rinat Prochii and Nikita Burtsev and Ali Alridha Abdulkarim and Artem Bazhenov and Vladimir Berman and Dzmitry Tsetserukou},
      year={2024},
      eprint={2401.09388},
      archivePrefix={arXiv},
      primaryClass={cs.RO}
}

Dataset BibTeX citation:

@misc{cognitivedog_dataset,
      title={CognitiveDog\_dataset},
      author={Artem Lykov and Mikhail Litvinov and Mikhail Konenkov and Rinat Prochii and Nikita Burtsev and Ali Alridha Abdulkarim and Artem Bazhenov and Vladimir Berman and Dzmitry Tsetserukou},
      year={2024},
      publisher={Hugging Face},
      howpublished={\url{URL}}
}
[]
[ "TAGS\n#license-cc-by-4.0 #arxiv-2401.09388 #region-us \n" ]
8857d7024146cb9a2c4515ab622b97b0b8163cd6
尝试解决"llm repetition problem",使用分词模型对oaast语料进行“结巴化”数据增强,提供更强的重复内容拒绝效果。 Attempts to solve the "llm repetition problem" by using a segmentation model to enhance the oaast corpus with "stuttering" data to provide stronger rejection of duplicate content. 其次,还过滤掉了所有自我认知的微调样本。 Second, it also filters out all the fine-tuned samples of self-cognition. files: - oaast_rm_zh_jieba.jsonl : word level repeat - oaast_rm_zh_sent_jieba.jsonl : sentence level repeat
lenML/oaast_rm_zh_jieba
[ "size_categories:n<1K", "language:zh", "license:apache-2.0", "human-feedback", "region:us" ]
2024-01-11T17:56:30+00:00
{"language": ["zh"], "license": "apache-2.0", "size_categories": ["n<1K"], "tags": ["human-feedback"]}
2024-01-13T07:34:42+00:00
[]
[ "zh" ]
TAGS #size_categories-n<1K #language-Chinese #license-apache-2.0 #human-feedback #region-us
尝试解决"llm repetition problem",使用分词模型对oaast语料进行“结巴化”数据增强,提供更强的重复内容拒绝效果。 Attempts to solve the "llm repetition problem" by using a segmentation model to enhance the oaast corpus with "stuttering" data to provide stronger rejection of duplicate content. 其次,还过滤掉了所有自我认知的微调样本。 Second, it also filters out all the fine-tuned samples of self-cognition. files: - oaast_rm_zh_jieba.jsonl : word level repeat - oaast_rm_zh_sent_jieba.jsonl : sentence level repeat
[]
[ "TAGS\n#size_categories-n<1K #language-Chinese #license-apache-2.0 #human-feedback #region-us \n" ]
fb1c38193b9817b67ce4cf1f34e71848e2c9f220
# Dataset Card for Wikipedia This repo is a fork of the [olm/wikipedia](https://huggingface.co/datasets/olm/wikipedia) repo which itself is a fork of the original Hugging Face Wikipedia repo [here](https://huggingface.co/datasets/wikipedia). This fork modifies `olm/wikipedia` to enable running on lower resourced machines. These changes have been proposed as a [PR with the olm/wikipedia project](https://huggingface.co/datasets/olm/wikipedia/discussions/6). ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org) - **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Dataset Summary Wikipedia dataset containing cleaned articles of all languages. The datasets are built from the Wikipedia dump (https://dumps.wikimedia.org/) with one split per language. Each example contains the content of one full Wikipedia article with cleaning to strip markdown and unwanted sections (references, etc.). The articles are parsed using the ``mwparserfromhell`` tool. To load this dataset you need to install the following dependencies: ``` pip install mwparserfromhell datasets ``` Then, you can load any subset of Wikipedia per language and per date this way: ```python from datasets import load_dataset load_dataset("neuml/wikipedia", language="en", date="20240101") ``` You can find the full list of languages and dates [here](https://dumps.wikimedia.org/backup-index.html). ### Supported Tasks and Leaderboards The dataset is generally used for Language Modeling. ### Languages You can find the list of languages [here](https://meta.wikimedia.org/wiki/List_of_Wikipedias). ## Dataset Structure ### Data Instances An example looks as follows: ``` {'id': '1', 'url': 'https://simple.wikipedia.org/wiki/April', 'title': 'April', 'text': 'April is the fourth month...' } ``` ### Data Fields The data fields are the same among all configurations: - `id` (`str`): ID of the article. - `url` (`str`): URL of the article. - `title` (`str`): Title of the article. - `text` (`str`): Text content of the article. 
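As a small usage sketch (grounded in the loading snippet above; the `simple` subset is chosen here only to keep the download small), individual articles and their fields can be accessed like any Hugging Face dataset:

```python
from datasets import load_dataset

# Load one language/date subset; Wikipedia datasets expose a single "train" split.
ds = load_dataset("neuml/wikipedia", language="simple", date="20240101", split="train")

article = ds[0]
print(article["id"], article["title"], article["url"])
print(article["text"][:200])  # first characters of the cleaned article text
```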
### Curation Rationale [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Source Data #### Initial Data Collection and Normalization [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the source language producers? [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Annotations #### Annotation process [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the annotators? [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Personal and Sensitive Information [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Discussion of Biases [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Other Known Limitations [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Additional Information ### Dataset Curators [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Licensing Information Most of Wikipedia's text and many of its images are co-licensed under the [Creative Commons Attribution-ShareAlike 3.0 Unported License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License)(CC BY-SA) and the [GNU Free Documentation License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_GNU_Free_Documentation_License)(GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts). Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes the text. ### Citation Information ``` @ONLINE{wikidump, author = "Wikimedia Foundation", title = "Wikimedia Downloads", url = "https://dumps.wikimedia.org" } ```
NeuML/wikipedia
[ "task_categories:text-generation", "task_categories:fill-mask", "task_ids:language-modeling", "task_ids:masked-language-modeling", "annotations_creators:no-annotation", "language_creators:crowdsourced", "multilinguality:multilingual", "size_categories:n<1K", "size_categories:1K<n<10K", "size_categories:10K<n<100K", "size_categories:100K<n<1M", "size_categories:1M<n<10M", "source_datasets:original", "language:aa", "language:ab", "language:ace", "language:af", "language:ak", "language:als", "language:am", "language:an", "language:ang", "language:ar", "language:arc", "language:arz", "language:as", "language:ast", "language:atj", "language:av", "language:ay", "language:az", "language:azb", "language:ba", "language:bar", "language:bcl", "language:be", "language:bg", "language:bh", "language:bi", "language:bjn", "language:bm", "language:bn", "language:bo", "language:bpy", "language:br", "language:bs", "language:bug", "language:bxr", "language:ca", "language:cbk", "language:cdo", "language:ce", "language:ceb", "language:ch", "language:cho", "language:chr", "language:chy", "language:ckb", "language:co", "language:cr", "language:crh", "language:cs", "language:csb", "language:cu", "language:cv", "language:cy", "language:da", "language:de", "language:din", "language:diq", "language:dsb", "language:dty", "language:dv", "language:dz", "language:ee", "language:el", "language:eml", "language:en", "language:eo", "language:es", "language:et", "language:eu", "language:ext", "language:fa", "language:ff", "language:fi", "language:fj", "language:fo", "language:fr", "language:frp", "language:frr", "language:fur", "language:fy", "language:ga", "language:gag", "language:gan", "language:gd", "language:gl", "language:glk", "language:gn", "language:gom", "language:gor", "language:got", "language:gu", "language:gv", "language:ha", "language:hak", "language:haw", "language:he", "language:hi", "language:hif", "language:ho", "language:hr", "language:hsb", "language:ht", "language:hu", "language:hy", "language:ia", "language:id", "language:ie", "language:ig", "language:ii", "language:ik", "language:ilo", "language:inh", "language:io", "language:is", "language:it", "language:iu", "language:ja", "language:jam", "language:jbo", "language:jv", "language:ka", "language:kaa", "language:kab", "language:kbd", "language:kbp", "language:kg", "language:ki", "language:kj", "language:kk", "language:kl", "language:km", "language:kn", "language:ko", "language:koi", "language:krc", "language:ks", "language:ksh", "language:ku", "language:kv", "language:kw", "language:ky", "language:la", "language:lad", "language:lb", "language:lbe", "language:lez", "language:lfn", "language:lg", "language:li", "language:lij", "language:lmo", "language:ln", "language:lo", "language:lrc", "language:lt", "language:ltg", "language:lv", "language:lzh", "language:mai", "language:mdf", "language:mg", "language:mh", "language:mhr", "language:mi", "language:min", "language:mk", "language:ml", "language:mn", "language:mr", "language:mrj", "language:ms", "language:mt", "language:mus", "language:mwl", "language:my", "language:myv", "language:mzn", "language:na", "language:nah", "language:nan", "language:nap", "language:nds", "language:ne", "language:new", "language:ng", "language:nl", "language:nn", "language:no", "language:nov", "language:nrf", "language:nso", "language:nv", "language:ny", "language:oc", "language:olo", "language:om", "language:or", "language:os", "language:pa", "language:pag", "language:pam", "language:pap", "language:pcd", "language:pdc", 
"language:pfl", "language:pi", "language:pih", "language:pl", "language:pms", "language:pnb", "language:pnt", "language:ps", "language:pt", "language:qu", "language:rm", "language:rmy", "language:rn", "language:ro", "language:ru", "language:rue", "language:rup", "language:rw", "language:sa", "language:sah", "language:sat", "language:sc", "language:scn", "language:sco", "language:sd", "language:se", "language:sg", "language:sgs", "language:sh", "language:si", "language:sk", "language:sl", "language:sm", "language:sn", "language:so", "language:sq", "language:sr", "language:srn", "language:ss", "language:st", "language:stq", "language:su", "language:sv", "language:sw", "language:szl", "language:ta", "language:tcy", "language:tdt", "language:te", "language:tg", "language:th", "language:ti", "language:tk", "language:tl", "language:tn", "language:to", "language:tpi", "language:tr", "language:ts", "language:tt", "language:tum", "language:tw", "language:ty", "language:tyv", "language:udm", "language:ug", "language:uk", "language:ur", "language:uz", "language:ve", "language:vec", "language:vep", "language:vi", "language:vls", "language:vo", "language:vro", "language:wa", "language:war", "language:wo", "language:wuu", "language:xal", "language:xh", "language:xmf", "language:yi", "language:yo", "language:yue", "language:za", "language:zea", "language:zh", "language:zu", "license:cc-by-sa-3.0", "license:gfdl", "region:us" ]
2024-01-11T17:58:59+00:00
{"annotations_creators": ["no-annotation"], "language_creators": ["crowdsourced"], "language": ["aa", "ab", "ace", "af", "ak", "als", "am", "an", "ang", "ar", "arc", "arz", "as", "ast", "atj", "av", "ay", "az", "azb", "ba", "bar", "bcl", "be", "bg", "bh", "bi", "bjn", "bm", "bn", "bo", "bpy", "br", "bs", "bug", "bxr", "ca", "cbk", "cdo", "ce", "ceb", "ch", "cho", "chr", "chy", "ckb", "co", "cr", "crh", "cs", "csb", "cu", "cv", "cy", "da", "de", "din", "diq", "dsb", "dty", "dv", "dz", "ee", "el", "eml", "en", "eo", "es", "et", "eu", "ext", "fa", "ff", "fi", "fj", "fo", "fr", "frp", "frr", "fur", "fy", "ga", "gag", "gan", "gd", "gl", "glk", "gn", "gom", "gor", "got", "gu", "gv", "ha", "hak", "haw", "he", "hi", "hif", "ho", "hr", "hsb", "ht", "hu", "hy", "ia", "id", "ie", "ig", "ii", "ik", "ilo", "inh", "io", "is", "it", "iu", "ja", "jam", "jbo", "jv", "ka", "kaa", "kab", "kbd", "kbp", "kg", "ki", "kj", "kk", "kl", "km", "kn", "ko", "koi", "krc", "ks", "ksh", "ku", "kv", "kw", "ky", "la", "lad", "lb", "lbe", "lez", "lfn", "lg", "li", "lij", "lmo", "ln", "lo", "lrc", "lt", "ltg", "lv", "lzh", "mai", "mdf", "mg", "mh", "mhr", "mi", "min", "mk", "ml", "mn", "mr", "mrj", "ms", "mt", "mus", "mwl", "my", "myv", "mzn", "na", "nah", "nan", "nap", "nds", "ne", "new", "ng", "nl", "nn", "no", "nov", "nrf", "nso", "nv", "ny", "oc", "olo", "om", "or", "os", "pa", "pag", "pam", "pap", "pcd", "pdc", "pfl", "pi", "pih", "pl", "pms", "pnb", "pnt", "ps", "pt", "qu", "rm", "rmy", "rn", "ro", "ru", "rue", "rup", "rw", "sa", "sah", "sat", "sc", "scn", "sco", "sd", "se", "sg", "sgs", "sh", "si", "sk", "sl", "sm", "sn", "so", "sq", "sr", "srn", "ss", "st", "stq", "su", "sv", "sw", "szl", "ta", "tcy", "tdt", "te", "tg", "th", "ti", "tk", "tl", "tn", "to", "tpi", "tr", "ts", "tt", "tum", "tw", "ty", "tyv", "udm", "ug", "uk", "ur", "uz", "ve", "vec", "vep", "vi", "vls", "vo", "vro", "wa", "war", "wo", "wuu", "xal", "xh", "xmf", "yi", "yo", "yue", "za", "zea", "zh", "zu"], "license": ["cc-by-sa-3.0", "gfdl"], "multilinguality": ["multilingual"], "size_categories": ["n<1K", "1K<n<10K", "10K<n<100K", "100K<n<1M", "1M<n<10M"], "source_datasets": ["original"], "task_categories": ["text-generation", "fill-mask"], "task_ids": ["language-modeling", "masked-language-modeling"], "pretty_name": "Wikipedia", "config_names": ["20240101.aa", "20220101.ab", "20240101.ace", "20240101.ady", "20240101.af", "20240101.ak", "20240101.als", "20240101.am", "20240101.an", "20240101.ang", "20240101.ar", "20240101.arc", "20240101.arz", "20240101.as", "20240101.ast", "20240101.atj", "20240101.av", "20240101.ay", "20240101.az", "20240101.azb", "20240101.ba", "20240101.bar", "20240101.bat-smg", "20240101.bcl", "20240101.be", "20240101.be-x-old", "20240101.bg", "20240101.bh", "20240101.bi", "20240101.bjn", "20240101.bm", "20240101.bn", "20240101.bo", "20240101.bpy", "20240101.br", "20240101.bs", "20240101.bug", "20240101.bxr", "20240101.ca", "20240101.cbk-zam", "20240101.cdo", "20240101.ce", "20240101.ceb", "20240101.ch", "20240101.cho", "20240101.chr", "20240101.chy", "20240101.ckb", "20240101.co", "20240101.cr", "20240101.crh", "20240101.cs", "20240101.csb", "20240101.cu", "20240101.cv", "20240101.cy", "20240101.da", "20240101.de", "20240101.din", "20240101.diq", "20240101.dsb", "20240101.dty", "20240101.dv", "20240101.dz", "20240101.ee", "20240101.el", "20240101.eml", "20240101.en", "20240101.eo", "20240101.es", "20240101.et", "20240101.eu", "20240101.ext", "20240101.fa", "20240101.ff", "20240101.fi", "20240101.fiu-vro", "20240101.fj", 
"20240101.fo", "20240101.fr", "20240101.frp", "20240101.frr", "20240101.fur", "20240101.fy", "20240101.ga", "20240101.gag", "20240101.gan", "20240101.gd", "20240101.gl", "20240101.glk", "20240101.gn", "20240101.gom", "20240101.gor", "20240101.got", "20240101.gu", "20240101.gv", "20240101.ha", "20240101.hak", "20240101.haw", "20240101.he", "20240101.hi", "20240101.hif", "20240101.ho", "20240101.hr", "20240101.hsb", "20240101.ht", "20240101.hu", "20240101.hy", "20240101.ia", "20240101.id", "20240101.ie", "20240101.ig", "20240101.ii", "20240101.ik", "20240101.ilo", "20240101.inh", "20240101.io", "20240101.is", "20240101.it", "20240101.iu", "20240101.ja", "20240101.jam", "20240101.jbo", "20240101.jv", "20240101.ka", "20240101.kaa", "20240101.kab", "20240101.kbd", "20240101.kbp", "20240101.kg", "20240101.ki", "20240101.kj", "20240101.kk", "20240101.kl", "20240101.km", "20240101.kn", "20240101.ko", "20240101.koi", "20240101.krc", "20240101.ks", "20240101.ksh", "20240101.ku", "20240101.kv", "20240101.kw", "20240101.ky", "20240101.la", "20240101.lad", "20240101.lb", "20240101.lbe", "20240101.lez", "20240101.lfn", "20240101.lg", "20240101.li", "20240101.lij", "20240101.lmo", "20240101.ln", "20240101.lo", "20240101.lrc", "20240101.lt", "20240101.ltg", "20240101.lv", "20240101.mai", "20240101.map-bms", "20240101.mdf", "20240101.mg", "20240101.mh", "20240101.mhr", "20240101.mi", "20240101.min", "20240101.mk", "20240101.ml", "20240101.mn", "20240101.mr", "20240101.mrj", "20240101.ms", "20240101.mt", "20240101.mus", "20240101.mwl", "20240101.my", "20240101.myv", "20240101.mzn", "20240101.na", "20240101.nah", "20240101.nap", "20240101.nds", "20240101.nds-nl", "20240101.ne", "20240101.new", "20240101.ng", "20240101.nl", "20240101.nn", "20240101.no", "20240101.nov", "20240101.nrm", "20240101.nso", "20240101.nv", "20240101.ny", "20240101.oc", "20240101.olo", "20240101.om", "20240101.or", "20240101.os", "20240101.pa", "20240101.pag", "20240101.pam", "20240101.pap", "20240101.pcd", "20240101.pdc", "20240101.pfl", "20240101.pi", "20240101.pih", "20240101.pl", "20240101.pms", "20240101.pnb", "20240101.pnt", "20240101.ps", "20240101.pt", "20240101.qu", "20240101.rm", "20240101.rmy", "20240101.rn", "20240101.ro", "20240101.roa-rup", "20240101.roa-tara", "20240101.ru", "20240101.rue", "20240101.rw", "20240101.sa", "20240101.sah", "20240101.sat", "20240101.sc", "20240101.scn", "20240101.sco", "20240101.sd", "20240101.se", "20240101.sg", "20240101.sh", "20240101.si", "20240101.simple", "20240101.sk", "20240101.sl", "20240101.sm", "20240101.sn", "20240101.so", "20240101.sq", "20240101.sr", "20240101.srn", "20240101.ss", "20240101.st", "20240101.stq", "20240101.su", "20240101.sv", "20240101.sw", "20240101.szl", "20240101.ta", "20240101.tcy", "20240101.te", "20240101.tet", "20240101.tg", "20240101.th", "20240101.ti", "20240101.tk", "20240101.tl", "20240101.tn", "20240101.to", "20240101.tpi", "20240101.tr", "20240101.ts", "20240101.tt", "20240101.tum", "20240101.tw", "20240101.ty", "20240101.tyv", "20240101.udm", "20240101.ug", "20240101.uk", "20240101.ur", "20240101.uz", "20240101.ve", "20240101.vec", "20240101.vep", "20240101.vi", "20240101.vls", "20240101.vo", "20240101.wa", "20240101.war", "20240101.wo", "20240101.wuu", "20240101.xal", "20240101.xh", "20240101.xmf", "20240101.yi", "20240101.yo", "20240101.za", "20240101.zea", "20240101.zh", "20240101.zh-classical", "20240101.zh-min-nan", "20240101.zh-yue", "20240101.zu"], "language_bcp47": ["nds-nl"]}
2024-01-11T18:26:35+00:00
[]
[ "aa", "ab", "ace", "af", "ak", "als", "am", "an", "ang", "ar", "arc", "arz", "as", "ast", "atj", "av", "ay", "az", "azb", "ba", "bar", "bcl", "be", "bg", "bh", "bi", "bjn", "bm", "bn", "bo", "bpy", "br", "bs", "bug", "bxr", "ca", "cbk", "cdo", "ce", "ceb", "ch", "cho", "chr", "chy", "ckb", "co", "cr", "crh", "cs", "csb", "cu", "cv", "cy", "da", "de", "din", "diq", "dsb", "dty", "dv", "dz", "ee", "el", "eml", "en", "eo", "es", "et", "eu", "ext", "fa", "ff", "fi", "fj", "fo", "fr", "frp", "frr", "fur", "fy", "ga", "gag", "gan", "gd", "gl", "glk", "gn", "gom", "gor", "got", "gu", "gv", "ha", "hak", "haw", "he", "hi", "hif", "ho", "hr", "hsb", "ht", "hu", "hy", "ia", "id", "ie", "ig", "ii", "ik", "ilo", "inh", "io", "is", "it", "iu", "ja", "jam", "jbo", "jv", "ka", "kaa", "kab", "kbd", "kbp", "kg", "ki", "kj", "kk", "kl", "km", "kn", "ko", "koi", "krc", "ks", "ksh", "ku", "kv", "kw", "ky", "la", "lad", "lb", "lbe", "lez", "lfn", "lg", "li", "lij", "lmo", "ln", "lo", "lrc", "lt", "ltg", "lv", "lzh", "mai", "mdf", "mg", "mh", "mhr", "mi", "min", "mk", "ml", "mn", "mr", "mrj", "ms", "mt", "mus", "mwl", "my", "myv", "mzn", "na", "nah", "nan", "nap", "nds", "ne", "new", "ng", "nl", "nn", "no", "nov", "nrf", "nso", "nv", "ny", "oc", "olo", "om", "or", "os", "pa", "pag", "pam", "pap", "pcd", "pdc", "pfl", "pi", "pih", "pl", "pms", "pnb", "pnt", "ps", "pt", "qu", "rm", "rmy", "rn", "ro", "ru", "rue", "rup", "rw", "sa", "sah", "sat", "sc", "scn", "sco", "sd", "se", "sg", "sgs", "sh", "si", "sk", "sl", "sm", "sn", "so", "sq", "sr", "srn", "ss", "st", "stq", "su", "sv", "sw", "szl", "ta", "tcy", "tdt", "te", "tg", "th", "ti", "tk", "tl", "tn", "to", "tpi", "tr", "ts", "tt", "tum", "tw", "ty", "tyv", "udm", "ug", "uk", "ur", "uz", "ve", "vec", "vep", "vi", "vls", "vo", "vro", "wa", "war", "wo", "wuu", "xal", "xh", "xmf", "yi", "yo", "yue", "za", "zea", "zh", "zu" ]
TAGS #task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-no-annotation #language_creators-crowdsourced #multilinguality-multilingual #size_categories-n<1K #size_categories-1K<n<10K #size_categories-10K<n<100K #size_categories-100K<n<1M #size_categories-1M<n<10M #source_datasets-original #language-Afar #language-Abkhazian #language-Achinese #language-Afrikaans #language-Akan #language-Tosk Albanian #language-Amharic #language-Aragonese #language-Old English (ca. 450-1100) #language-Arabic #language-Official Aramaic (700-300 BCE) #language-Egyptian Arabic #language-Assamese #language-Asturian #language-Atikamekw #language-Avaric #language-Aymara #language-Azerbaijani #language-South Azerbaijani #language-Bashkir #language-Bavarian #language-Central Bikol #language-Belarusian #language-Bulgarian #language-bh #language-Bislama #language-Banjar #language-Bambara #language-Bengali #language-Tibetan #language-Bishnupriya #language-Breton #language-Bosnian #language-Buginese #language-Russia Buriat #language-Catalan #language-Chavacano #language-Min Dong Chinese #language-Chechen #language-Cebuano #language-Chamorro #language-Choctaw #language-Cherokee #language-Cheyenne #language-Central Kurdish #language-Corsican #language-Cree #language-Crimean Tatar #language-Czech #language-Kashubian #language-Church Slavic #language-Chuvash #language-Welsh #language-Danish #language-German #language-Dinka #language-Dimli (individual language) #language-Lower Sorbian #language-Dotyali #language-Dhivehi #language-Dzongkha #language-Ewe #language-Modern Greek (1453-) #language-Emiliano-Romagnolo #language-English #language-Esperanto #language-Spanish #language-Estonian #language-Basque #language-Extremaduran #language-Persian #language-Fulah #language-Finnish #language-Fijian #language-Faroese #language-French #language-Arpitan #language-Northern Frisian #language-Friulian #language-Western Frisian #language-Irish #language-Gagauz #language-Gan Chinese #language-Scottish Gaelic #language-Galician #language-Gilaki #language-Guarani #language-Goan Konkani #language-Gorontalo #language-Gothic #language-Gujarati #language-Manx #language-Hausa #language-Hakka Chinese #language-Hawaiian #language-Hebrew #language-Hindi #language-Fiji Hindi #language-Hiri Motu #language-Croatian #language-Upper Sorbian #language-Haitian #language-Hungarian #language-Armenian #language-Interlingua (International Auxiliary Language Association) #language-Indonesian #language-Interlingue #language-Igbo #language-Sichuan Yi #language-Inupiaq #language-Iloko #language-Ingush #language-Ido #language-Icelandic #language-Italian #language-Inuktitut #language-Japanese #language-Jamaican Creole English #language-Lojban #language-Javanese #language-Georgian #language-Kara-Kalpak #language-Kabyle #language-Kabardian #language-Kabiyè #language-Kongo #language-Kikuyu #language-Kuanyama #language-Kazakh #language-Kalaallisut #language-Khmer #language-Kannada #language-Korean #language-Komi-Permyak #language-Karachay-Balkar #language-Kashmiri #language-Kölsch #language-Kurdish #language-Komi #language-Cornish #language-Kirghiz #language-Latin #language-Ladino #language-Luxembourgish #language-Lak #language-Lezghian #language-Lingua Franca Nova #language-Ganda #language-Limburgan #language-Ligurian #language-Lombard #language-Lingala #language-Lao #language-Northern Luri #language-Lithuanian #language-Latgalian #language-Latvian #language-Literary Chinese 
#language-Maithili #language-Moksha #language-Malagasy #language-Marshallese #language-Eastern Mari #language-Maori #language-Minangkabau #language-Macedonian #language-Malayalam #language-Mongolian #language-Marathi #language-Western Mari #language-Malay (macrolanguage) #language-Maltese #language-Creek #language-Mirandese #language-Burmese #language-Erzya #language-Mazanderani #language-Nauru #language-nah #language-Min Nan Chinese #language-Neapolitan #language-Low German #language-Nepali (macrolanguage) #language-Newari #language-Ndonga #language-Dutch #language-Norwegian Nynorsk #language-Norwegian #language-Novial #language-Jèrriais #language-Pedi #language-Navajo #language-Nyanja #language-Occitan (post 1500) #language-Livvi #language-Oromo #language-Oriya (macrolanguage) #language-Ossetian #language-Panjabi #language-Pangasinan #language-Pampanga #language-Papiamento #language-Picard #language-Pennsylvania German #language-Pfaelzisch #language-Pali #language-Pitcairn-Norfolk #language-Polish #language-Piemontese #language-Western Panjabi #language-Pontic #language-Pushto #language-Portuguese #language-Quechua #language-Romansh #language-Vlax Romani #language-Rundi #language-Romanian #language-Russian #language-Rusyn #language-Macedo-Romanian #language-Kinyarwanda #language-Sanskrit #language-Yakut #language-Santali #language-Sardinian #language-Sicilian #language-Scots #language-Sindhi #language-Northern Sami #language-Sango #language-Samogitian #language-Serbo-Croatian #language-Sinhala #language-Slovak #language-Slovenian #language-Samoan #language-Shona #language-Somali #language-Albanian #language-Serbian #language-Sranan Tongo #language-Swati #language-Southern Sotho #language-Saterfriesisch #language-Sundanese #language-Swedish #language-Swahili (macrolanguage) #language-Silesian #language-Tamil #language-Tulu #language-Tetun Dili #language-Telugu #language-Tajik #language-Thai #language-Tigrinya #language-Turkmen #language-Tagalog #language-Tswana #language-Tonga (Tonga Islands) #language-Tok Pisin #language-Turkish #language-Tsonga #language-Tatar #language-Tumbuka #language-Twi #language-Tahitian #language-Tuvinian #language-Udmurt #language-Uighur #language-Ukrainian #language-Urdu #language-Uzbek #language-Venda #language-Venetian #language-Veps #language-Vietnamese #language-Vlaams #language-Volapük #language-Võro #language-Walloon #language-Waray (Philippines) #language-Wolof #language-Wu Chinese #language-Kalmyk #language-Xhosa #language-Mingrelian #language-Yiddish #language-Yoruba #language-Yue Chinese #language-Zhuang #language-Zeeuws #language-Chinese #language-Zulu #license-cc-by-sa-3.0 #license-gfdl #region-us
# Dataset Card for Wikipedia This repo is a fork of the olm/wikipedia repo which itself is a fork of the original Hugging Face Wikipedia repo here. This fork modifies 'olm/wikipedia' to enable running on lower resourced machines. These changes have been proposed as a PR with the olm/wikipedia project. ## Table of Contents - Dataset Description - Dataset Summary - Supported Tasks and Leaderboards - Languages - Dataset Structure - Data Instances - Data Fields - Data Splits - Dataset Creation - Curation Rationale - Source Data - Annotations - Personal and Sensitive Information - Considerations for Using the Data - Social Impact of Dataset - Discussion of Biases - Other Known Limitations - Additional Information - Dataset Curators - Licensing Information - Citation Information - Contributions ## Dataset Description - Homepage: URL - Repository: - Paper: - Point of Contact: ### Dataset Summary Wikipedia dataset containing cleaned articles of all languages. The datasets are built from the Wikipedia dump (URL with one split per language. Each example contains the content of one full Wikipedia article with cleaning to strip markdown and unwanted sections (references, etc.). The articles are parsed using the ''mwparserfromhell'' tool. To load this dataset you need to install the following dependencies: Then, you can load any subset of Wikipedia per language and per date this way: You can find the full list of languages and dates here. ### Supported Tasks and Leaderboards The dataset is generally used for Language Modeling. ### Languages You can find the list of languages here. ## Dataset Structure ### Data Instances An example looks as follows: ### Data Fields The data fields are the same among all configurations: - 'id' ('str'): ID of the article. - 'url' ('str'): URL of the article. - 'title' ('str'): Title of the article. - 'text' ('str'): Text content of the article. ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information Most of Wikipedia's text and many of its images are co-licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License(CC BY-SA) and the GNU Free Documentation License(GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts). Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes the text.
[ "# Dataset Card for Wikipedia\n\nThis repo is a fork of the olm/wikipedia repo which itself is a fork of the original Hugging Face Wikipedia repo here.\n\nThis fork modifies 'olm/wikipedia' to enable running on lower resourced machines. These changes have been proposed as a PR with the olm/wikipedia project.", "## Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions", "## Dataset Description\n\n- Homepage: URL\n- Repository: \n- Paper: \n- Point of Contact:", "### Dataset Summary\n\nWikipedia dataset containing cleaned articles of all languages. The datasets are built from the Wikipedia dump (URL with one split per language. Each example contains the content of one full Wikipedia article with cleaning to strip markdown and unwanted sections (references, etc.).\n\nThe articles are parsed using the ''mwparserfromhell'' tool.\n\nTo load this dataset you need to install the following dependencies:\n\n\n\nThen, you can load any subset of Wikipedia per language and per date this way:\n\n\n\nYou can find the full list of languages and dates here.", "### Supported Tasks and Leaderboards\n\nThe dataset is generally used for Language Modeling.", "### Languages\n\nYou can find the list of languages here.", "## Dataset Structure", "### Data Instances\n\nAn example looks as follows:", "### Data Fields\n\nThe data fields are the same among all configurations:\n\n- 'id' ('str'): ID of the article.\n- 'url' ('str'): URL of the article.\n- 'title' ('str'): Title of the article.\n- 'text' ('str'): Text content of the article.", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information\n\nMost of Wikipedia's text and many of its images are co-licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License(CC BY-SA) and the GNU Free Documentation License(GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts). \n\nSome text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes the text." ]
[ "TAGS\n#task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-no-annotation #language_creators-crowdsourced #multilinguality-multilingual #size_categories-n<1K #size_categories-1K<n<10K #size_categories-10K<n<100K #size_categories-100K<n<1M #size_categories-1M<n<10M #source_datasets-original #language-Afar #language-Abkhazian #language-Achinese #language-Afrikaans #language-Akan #language-Tosk Albanian #language-Amharic #language-Aragonese #language-Old English (ca. 450-1100) #language-Arabic #language-Official Aramaic (700-300 BCE) #language-Egyptian Arabic #language-Assamese #language-Asturian #language-Atikamekw #language-Avaric #language-Aymara #language-Azerbaijani #language-South Azerbaijani #language-Bashkir #language-Bavarian #language-Central Bikol #language-Belarusian #language-Bulgarian #language-bh #language-Bislama #language-Banjar #language-Bambara #language-Bengali #language-Tibetan #language-Bishnupriya #language-Breton #language-Bosnian #language-Buginese #language-Russia Buriat #language-Catalan #language-Chavacano #language-Min Dong Chinese #language-Chechen #language-Cebuano #language-Chamorro #language-Choctaw #language-Cherokee #language-Cheyenne #language-Central Kurdish #language-Corsican #language-Cree #language-Crimean Tatar #language-Czech #language-Kashubian #language-Church Slavic #language-Chuvash #language-Welsh #language-Danish #language-German #language-Dinka #language-Dimli (individual language) #language-Lower Sorbian #language-Dotyali #language-Dhivehi #language-Dzongkha #language-Ewe #language-Modern Greek (1453-) #language-Emiliano-Romagnolo #language-English #language-Esperanto #language-Spanish #language-Estonian #language-Basque #language-Extremaduran #language-Persian #language-Fulah #language-Finnish #language-Fijian #language-Faroese #language-French #language-Arpitan #language-Northern Frisian #language-Friulian #language-Western Frisian #language-Irish #language-Gagauz #language-Gan Chinese #language-Scottish Gaelic #language-Galician #language-Gilaki #language-Guarani #language-Goan Konkani #language-Gorontalo #language-Gothic #language-Gujarati #language-Manx #language-Hausa #language-Hakka Chinese #language-Hawaiian #language-Hebrew #language-Hindi #language-Fiji Hindi #language-Hiri Motu #language-Croatian #language-Upper Sorbian #language-Haitian #language-Hungarian #language-Armenian #language-Interlingua (International Auxiliary Language Association) #language-Indonesian #language-Interlingue #language-Igbo #language-Sichuan Yi #language-Inupiaq #language-Iloko #language-Ingush #language-Ido #language-Icelandic #language-Italian #language-Inuktitut #language-Japanese #language-Jamaican Creole English #language-Lojban #language-Javanese #language-Georgian #language-Kara-Kalpak #language-Kabyle #language-Kabardian #language-Kabiyè #language-Kongo #language-Kikuyu #language-Kuanyama #language-Kazakh #language-Kalaallisut #language-Khmer #language-Kannada #language-Korean #language-Komi-Permyak #language-Karachay-Balkar #language-Kashmiri #language-Kölsch #language-Kurdish #language-Komi #language-Cornish #language-Kirghiz #language-Latin #language-Ladino #language-Luxembourgish #language-Lak #language-Lezghian #language-Lingua Franca Nova #language-Ganda #language-Limburgan #language-Ligurian #language-Lombard #language-Lingala #language-Lao #language-Northern Luri #language-Lithuanian #language-Latgalian #language-Latvian #language-Literary Chinese 
#language-Maithili #language-Moksha #language-Malagasy #language-Marshallese #language-Eastern Mari #language-Maori #language-Minangkabau #language-Macedonian #language-Malayalam #language-Mongolian #language-Marathi #language-Western Mari #language-Malay (macrolanguage) #language-Maltese #language-Creek #language-Mirandese #language-Burmese #language-Erzya #language-Mazanderani #language-Nauru #language-nah #language-Min Nan Chinese #language-Neapolitan #language-Low German #language-Nepali (macrolanguage) #language-Newari #language-Ndonga #language-Dutch #language-Norwegian Nynorsk #language-Norwegian #language-Novial #language-Jèrriais #language-Pedi #language-Navajo #language-Nyanja #language-Occitan (post 1500) #language-Livvi #language-Oromo #language-Oriya (macrolanguage) #language-Ossetian #language-Panjabi #language-Pangasinan #language-Pampanga #language-Papiamento #language-Picard #language-Pennsylvania German #language-Pfaelzisch #language-Pali #language-Pitcairn-Norfolk #language-Polish #language-Piemontese #language-Western Panjabi #language-Pontic #language-Pushto #language-Portuguese #language-Quechua #language-Romansh #language-Vlax Romani #language-Rundi #language-Romanian #language-Russian #language-Rusyn #language-Macedo-Romanian #language-Kinyarwanda #language-Sanskrit #language-Yakut #language-Santali #language-Sardinian #language-Sicilian #language-Scots #language-Sindhi #language-Northern Sami #language-Sango #language-Samogitian #language-Serbo-Croatian #language-Sinhala #language-Slovak #language-Slovenian #language-Samoan #language-Shona #language-Somali #language-Albanian #language-Serbian #language-Sranan Tongo #language-Swati #language-Southern Sotho #language-Saterfriesisch #language-Sundanese #language-Swedish #language-Swahili (macrolanguage) #language-Silesian #language-Tamil #language-Tulu #language-Tetun Dili #language-Telugu #language-Tajik #language-Thai #language-Tigrinya #language-Turkmen #language-Tagalog #language-Tswana #language-Tonga (Tonga Islands) #language-Tok Pisin #language-Turkish #language-Tsonga #language-Tatar #language-Tumbuka #language-Twi #language-Tahitian #language-Tuvinian #language-Udmurt #language-Uighur #language-Ukrainian #language-Urdu #language-Uzbek #language-Venda #language-Venetian #language-Veps #language-Vietnamese #language-Vlaams #language-Volapük #language-Võro #language-Walloon #language-Waray (Philippines) #language-Wolof #language-Wu Chinese #language-Kalmyk #language-Xhosa #language-Mingrelian #language-Yiddish #language-Yoruba #language-Yue Chinese #language-Zhuang #language-Zeeuws #language-Chinese #language-Zulu #license-cc-by-sa-3.0 #license-gfdl #region-us \n", "# Dataset Card for Wikipedia\n\nThis repo is a fork of the olm/wikipedia repo which itself is a fork of the original Hugging Face Wikipedia repo here.\n\nThis fork modifies 'olm/wikipedia' to enable running on lower resourced machines. 
These changes have been proposed as a PR with the olm/wikipedia project.", "## Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions", "## Dataset Description\n\n- Homepage: URL\n- Repository: \n- Paper: \n- Point of Contact:", "### Dataset Summary\n\nWikipedia dataset containing cleaned articles of all languages. The datasets are built from the Wikipedia dump (URL with one split per language. Each example contains the content of one full Wikipedia article with cleaning to strip markdown and unwanted sections (references, etc.).\n\nThe articles are parsed using the ''mwparserfromhell'' tool.\n\nTo load this dataset you need to install the following dependencies:\n\n\n\nThen, you can load any subset of Wikipedia per language and per date this way:\n\n\n\nYou can find the full list of languages and dates here.", "### Supported Tasks and Leaderboards\n\nThe dataset is generally used for Language Modeling.", "### Languages\n\nYou can find the list of languages here.", "## Dataset Structure", "### Data Instances\n\nAn example looks as follows:", "### Data Fields\n\nThe data fields are the same among all configurations:\n\n- 'id' ('str'): ID of the article.\n- 'url' ('str'): URL of the article.\n- 'title' ('str'): Title of the article.\n- 'text' ('str'): Text content of the article.", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information\n\nMost of Wikipedia's text and many of its images are co-licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License(CC BY-SA) and the GNU Free Documentation License(GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts). \n\nSome text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes the text." ]