sha | text | id | tags | created_at | metadata | last_modified | arxiv | languages | tags_str | text_str | text_lists | processed_texts | tokens_length | input_texts |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
81d4b39e4f6eb282d23fcc1ce338bbf2564e5363 |
by language:
- en
- es | NickyNicky/oasst2_chatml_filter_en_es | [
"language:en",
"language:es",
"region:us"
] | 2024-01-07T01:49:44+00:00 | {"language": ["en", "es"], "dataset_info": {"features": [{"name": "Text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24546606, "num_examples": 9651}], "download_size": 13233493, "dataset_size": 24546606}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-07T02:17:20+00:00 | [] | [
"en",
"es"
] | TAGS
#language-English #language-Spanish #region-us
|
by language:
- en
- es | [] | [
"TAGS\n#language-English #language-Spanish #region-us \n"
] | [
15
] | [
"passage: TAGS\n#language-English #language-Spanish #region-us \n"
] |
e3e8f23b959881b76ebc2f889b1fb114ca978181 | Japanese multi-turn conversation data generated with Qarasu14B from Wikipedia data. Available for non-commercial use only (because Qarasu14B was trained on ShareGPT data).
# Model
https://huggingface.co/lightblue/qarasu-14B-chat-plus-unleashed
# Dataset
https://huggingface.co/datasets/izumi-lab/wikipedia-ja-20230720
# Developed by
FreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)
https://www.free-ai.ltd/ | shi3z/Qarasu_Wikipedia_Multiturn | [
"license:apache-2.0",
"region:us"
] | 2024-01-07T02:09:05+00:00 | {"license": "apache-2.0"} | 2024-01-07T04:26:18+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| Japanese multi-turn conversation data generated with Qarasu14B from Wikipedia data. Available for non-commercial use only (because Qarasu14B was trained on ShareGPT data).
# Model
URL
# Dataset
URL
# Developed by
FreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)
URL | [
"# Model\nURL",
"# Dataset\nURL",
"# Developed by\nFreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)\nURL"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Model\nURL",
"# Dataset\nURL",
"# Developed by\nFreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)\nURL"
] | [
14,
3,
4,
22
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n# Model\nURL# Dataset\nURL# Developed by\nFreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)\nURL"
] |
87bf45e711bb0381009d8afb7d945cafc1217c8d | Japanese multi-turn conversation data generated with Qarasu14B from Wikipedia data, for non-commercial use only (because Qarasu14B was trained on ShareGPT data).
A human/gpt-format conversation dataset that can be used to train models with Axolotl.
# Based on
https://huggingface.co/datasets/shi3z/Qarasu_Wikipedia_Multiturn
# Model
https://huggingface.co/lightblue/qarasu-14B-chat-plus-unleashed
# Dataset
https://huggingface.co/datasets/izumi-lab/wikipedia-ja-20230720
# Developed by
FreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)
https://www.free-ai.ltd/ | shi3z/Qarasu_Wikipedia_multiturn_human_gpt_10K | [
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:ja",
"license:apache-2.0",
"region:us"
] | 2024-01-07T02:15:58+00:00 | {"language": ["ja"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational"]} | 2024-01-07T04:26:40+00:00 | [] | [
"ja"
] | TAGS
#task_categories-conversational #size_categories-10K<n<100K #language-Japanese #license-apache-2.0 #region-us
| Japanese multi-turn conversation data generated with Qarasu14B from Wikipedia data, for non-commercial use only (because Qarasu14B was trained on ShareGPT data).
A human/gpt-format conversation dataset that can be used to train models with Axolotl.
# Based on
URL
# Model
URL
# Dataset
URL
# Developed by
FreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)
URL | [
"# Based on\nURL",
"# Model\nURL",
"# Dataset\nURL",
"# Developed by\nFreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)\nURL"
] | [
"TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-Japanese #license-apache-2.0 #region-us \n",
"# Based on\nURL",
"# Model\nURL",
"# Dataset\nURL",
"# Developed by\nFreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)\nURL"
] | [
42,
5,
3,
4,
22
] | [
"passage: TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-Japanese #license-apache-2.0 #region-us \n# Based on\nURL# Model\nURL# Dataset\nURL# Developed by\nFreeAI Ltd. Tsuginosuke AI Super Computer(A100 80Gx8)\nURL"
] |
046ff06d45096b8b226c606977023983de6f0a9f | # Dataset Card for "cpsc2018"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | kushalps/cpsc2018 | [
"region:us"
] | 2024-01-07T03:01:14+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "1AVB", "1": "AF", "2": "LBBB", "3": "Normal", "4": "PAC", "5": "PVC", "6": "RBBB", "7": "STD", "8": "STE"}}}}], "splits": [{"name": "train", "num_bytes": 2271502441.611, "num_examples": 44327}, {"name": "validation", "num_bytes": 15416122.0, "num_examples": 285}, {"name": "test", "num_bytes": 66362558.867, "num_examples": 1283}], "download_size": 2478695413, "dataset_size": 2353281122.478}} | 2024-01-11T02:38:27+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cpsc2018"
More Information needed | [
"# Dataset Card for \"cpsc2018\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cpsc2018\"\n\nMore Information needed"
] | [
6,
13
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"cpsc2018\"\n\nMore Information needed"
] |
411e3b00f7fcdf43089360f11693bb03395860b7 | The Rhino dataset before AI-guided deep cleaning. It contains 1,960,351 examples. | M4-ai/Raw-Rhino | [
"task_categories:text-generation",
"task_categories:conversational",
"task_categories:question-answering",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-07T03:15:31+00:00 | {"language": ["en"], "license": "apache-2.0", "task_categories": ["text-generation", "conversational", "question-answering"]} | 2024-01-28T18:54:25+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #language-English #license-apache-2.0 #region-us
| The Rhino dataset before AI-guided deep cleaning. It contains 1,960,351 examples. | [] | [
"TAGS\n#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #language-English #license-apache-2.0 #region-us \n"
] | [
51
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #language-English #license-apache-2.0 #region-us \n"
] |
75c6606e3adbb289a304121cf37ddc4b2f3a20af |
# Federico García Lorca. Canciones, Poemas y Romances | xaviviro/FEDERICO-GARCIA-LORCA-canciones-poemas-romances | [
"size_categories:n<1K",
"language:es",
"license:apache-2.0",
"poesia",
"lorca",
"region:us"
] | 2024-01-07T03:23:49+00:00 | {"language": ["es"], "license": "apache-2.0", "size_categories": ["n<1K"], "pretty_name": "Federico Garc\u00eda Lorca. Canciones, Poemas y Romances", "tags": ["poesia", "lorca"]} | 2024-01-07T04:46:21+00:00 | [] | [
"es"
] | TAGS
#size_categories-n<1K #language-Spanish #license-apache-2.0 #poesia #lorca #region-us
|
# Federico García Lorca. Canciones, Poemas y Romances | [
"# Federico García Lorca. Canciones, Poemas y Romances"
] | [
"TAGS\n#size_categories-n<1K #language-Spanish #license-apache-2.0 #poesia #lorca #region-us \n",
"# Federico García Lorca. Canciones, Poemas y Romances"
] | [
35,
15
] | [
"passage: TAGS\n#size_categories-n<1K #language-Spanish #license-apache-2.0 #poesia #lorca #region-us \n# Federico García Lorca. Canciones, Poemas y Romances"
] |
642510f48b4a1e00f19180caf947abfb3dd188fe | # Dataset Card for "pretrain_repeat_paraphrase"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | xwjzds/pretrain_repeat_paraphrase | [
"region:us"
] | 2024-01-07T05:03:16+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2428913416, "num_examples": 875085}], "download_size": 1508464761, "dataset_size": 2428913416}} | 2024-01-10T01:10:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "pretrain_repeat_paraphrase"
More Information needed | [
"# Dataset Card for \"pretrain_repeat_paraphrase\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"pretrain_repeat_paraphrase\"\n\nMore Information needed"
] | [
6,
20
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"pretrain_repeat_paraphrase\"\n\nMore Information needed"
] |
ade5148d804d88f46581698a1558834b1d864800 | # Dataset Card for "bioacoustic_mel_segments"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | adilhabibi/bioacoustic_mel_segments | [
"region:us"
] | 2024-01-07T05:03:31+00:00 | {"dataset_info": {"features": [{"name": "segments", "sequence": {"sequence": {"sequence": "float32"}}}, {"name": "label_idices", "dtype": "int64"}, {"name": "label_names", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 72803953, "num_examples": 1457}], "download_size": 53309954, "dataset_size": 72803953}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-07T05:07:08+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "bioacoustic_mel_segments"
More Information needed | [
"# Dataset Card for \"bioacoustic_mel_segments\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"bioacoustic_mel_segments\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"bioacoustic_mel_segments\"\n\nMore Information needed"
] |
1ca482b19b4635c852202d9ca87dce572f90ecb7 | # Dataset Card for "CVEs_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Nganlt/CVEs_100 | [
"region:us"
] | 2024-01-07T05:07:30+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 345807, "num_examples": 1500}], "download_size": 101396, "dataset_size": 345807}} | 2024-01-07T05:12:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "CVEs_100"
More Information needed | [
"# Dataset Card for \"CVEs_100\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"CVEs_100\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"CVEs_100\"\n\nMore Information needed"
] |
ed947e50f2f0f2ead636a8b885fd2c77456bb8c7 |
# IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages
Paper link: https://arxiv.org/abs/2312.09508
Dataset link: https://huggingface.co/datasets/saifulhaq9/indicmarco
Model link: https://huggingface.co/saifulhaq9/indiccolbert
## Contributors & Acknowledgements
Key Contributors and Team Members: Saiful Haq, Ashutosh Sharma, Pushpak Bhattacharyya
## Kindly cite our paper if you are using our datasets or models:
@article{haq2023indicirsuite,
title={IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages},
author={Haq, Saiful and Sharma, Ashutosh and Bhattacharyya, Pushpak},
journal={arXiv preprint arXiv:2312.09508},
year={2023}
}
## About
This repository contains query.train.tsv and collection.tsv files in 11 Indian Languages,
to train multilingual IR models.
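As a minimal illustration of working with these files, the sketch below reads a two-column TSV with the Python standard library. The `id<TAB>text` layout and the per-language paths are assumptions based on common MS MARCO conventions, not details confirmed by this card:
```python
import csv

def read_tsv(path):
    # Read a two-column (id, text) TSV file into a dict.
    # Assumes one record per line: an id, a tab, then the text
    # (the usual MS MARCO layout -- an assumption, not confirmed here).
    records = {}
    with open(path, encoding="utf-8", newline="") as f:
        for row in csv.reader(f, delimiter="\t"):
            records[row[0]] = row[1]
    return records

# Hypothetical per-language paths within this repository.
queries = read_tsv("hin_Deva/query.train.tsv")
collection = read_tsv("hin_Deva/collection.tsv")
print(len(queries), len(collection))
```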
## Language Code to Language Mapping
asm_Beng: Assamese Language
ben_Beng: Bengali Language
guj_Gujr: Gujarati Language
hin_Deva: Hindi Language
kan_Knda: Kannada Language
mal_Mlym: Malayalam Language
mar_Deva: Marathi Language
ory_Orya: Oriya Language
pan_Guru: Punjabi Language
tam_Taml: Tamil Language
tel_Telu: Telugu Language
| saifulhaq9/indicmarco | [
"license:mit",
"arxiv:2312.09508",
"region:us"
] | 2024-01-07T05:20:10+00:00 | {"license": "mit"} | 2024-01-16T04:41:01+00:00 | [
"2312.09508"
] | [] | TAGS
#license-mit #arxiv-2312.09508 #region-us
|
# IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages
Paper link: URL
Dataset link: URL
Model link: URL
## Contributors & Acknowledgements
Key Contributors and Team Members: Saiful Haq, Ashutosh Sharma, Pushpak Bhattacharyya
## Kindly cite our paper if you are using our datasets or models:
@article{haq2023indicirsuite,
title={IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages},
author={Haq, Saiful and Sharma, Ashutosh and Bhattacharyya, Pushpak},
journal={arXiv preprint arXiv:2312.09508},
year={2023}
}
## About
This repository contains URL and URL files in 11 Indian Languages,
to train multilingual IR models.
## Language Code to Language Mapping
asm_Beng: Assamese Language
ben_Beng: Bengali Language
guj_Gujr: Gujarati Language
hin_Deva: Hindi Language
kan_Knda: Kannada Language
mal_Mlym: Malayalam Language
mar_Deva: Marathi Language
ory_Orya: Oriya Language
pan_Guru: Punjabi Language
tam_Taml: Tamil Language
tel_Telu: Telugu Language
| [
"# IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages\n\nPaper link: URL\n\nDataset link: URL\n\nModel link: URL",
"## Contributors & Acknowledgements\n\nKey Contributors and Team Members: Saiful Haq, Ashutosh Sharma, Pushpak Bhattacharyya",
"## Kindly cite our paper, If you are are using our datasets or models:\n\n@article{haq2023indicirsuite,\n title={IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages},\n author={Haq, Saiful and Sharma, Ashutosh and Bhattacharyya, Pushpak},\n journal={arXiv preprint arXiv:2312.09508},\n year={2023}\n}",
"## About\n\nThis repository contains URL and URL files in 11 Indian Languages,\nto train multilingual IR models.",
"## Language Code to Language Mapping\n\nasm_Beng: Assamese Language\n\nben_Beng: Bengali Language\n\nguj_Gujr: Gujarati Language\n\nhin_Deva: Hindi Language\n\nkan_Knda: Kannada Language\n\nmal_Mlym: Malyalam Language\n\nmar_Deva: Marathi Language\n\nory_Orya: Oriya Language\n\npan_Guru: Punjabi Language\n\ntam_Taml: Tamil Language\n\ntel_Telu: Telugu Language"
] | [
"TAGS\n#license-mit #arxiv-2312.09508 #region-us \n",
"# IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages\n\nPaper link: URL\n\nDataset link: URL\n\nModel link: URL",
"## Contributors & Acknowledgements\n\nKey Contributors and Team Members: Saiful Haq, Ashutosh Sharma, Pushpak Bhattacharyya",
"## Kindly cite our paper, If you are are using our datasets or models:\n\n@article{haq2023indicirsuite,\n title={IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages},\n author={Haq, Saiful and Sharma, Ashutosh and Bhattacharyya, Pushpak},\n journal={arXiv preprint arXiv:2312.09508},\n year={2023}\n}",
"## About\n\nThis repository contains URL and URL files in 11 Indian Languages,\nto train multilingual IR models.",
"## Language Code to Language Mapping\n\nasm_Beng: Assamese Language\n\nben_Beng: Bengali Language\n\nguj_Gujr: Gujarati Language\n\nhin_Deva: Hindi Language\n\nkan_Knda: Kannada Language\n\nmal_Mlym: Malyalam Language\n\nmar_Deva: Marathi Language\n\nory_Orya: Oriya Language\n\npan_Guru: Punjabi Language\n\ntam_Taml: Tamil Language\n\ntel_Telu: Telugu Language"
] | [
21,
35,
33,
106,
26,
94
] | [
"passage: TAGS\n#license-mit #arxiv-2312.09508 #region-us \n# IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages\n\nPaper link: URL\n\nDataset link: URL\n\nModel link: URL## Contributors & Acknowledgements\n\nKey Contributors and Team Members: Saiful Haq, Ashutosh Sharma, Pushpak Bhattacharyya## Kindly cite our paper, If you are are using our datasets or models:\n\n@article{haq2023indicirsuite,\n title={IndicIRSuite: Multilingual Dataset and Neural Information Models for Indian Languages},\n author={Haq, Saiful and Sharma, Ashutosh and Bhattacharyya, Pushpak},\n journal={arXiv preprint arXiv:2312.09508},\n year={2023}\n}## About\n\nThis repository contains URL and URL files in 11 Indian Languages,\nto train multilingual IR models.## Language Code to Language Mapping\n\nasm_Beng: Assamese Language\n\nben_Beng: Bengali Language\n\nguj_Gujr: Gujarati Language\n\nhin_Deva: Hindi Language\n\nkan_Knda: Kannada Language\n\nmal_Mlym: Malyalam Language\n\nmar_Deva: Marathi Language\n\nory_Orya: Oriya Language\n\npan_Guru: Punjabi Language\n\ntam_Taml: Tamil Language\n\ntel_Telu: Telugu Language"
] |
334bbb1f92817fd005c29044a1cb595275abdc97 | # Dataset Card for "Therapydataset_formatted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Ayush2312/Therapydataset_formatted | [
"region:us"
] | 2024-01-07T05:20:57+00:00 | {"dataset_info": {"features": [{"name": "train", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 407954044, "num_examples": 99086}], "download_size": 205585014, "dataset_size": 407954044}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-07T05:21:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Therapydataset_formatted"
More Information needed | [
"# Dataset Card for \"Therapydataset_formatted\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Therapydataset_formatted\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"Therapydataset_formatted\"\n\nMore Information needed"
] |
8794b62dd8e433dc8a77d11c81f21af0b0ad3c8f | # AIGCBench v1.0
AIGCBench is a novel and comprehensive benchmark designed for evaluating the capabilities of state-of-the-art video generation algorithms. Official dataset for the paper: **AIGCBench: Comprehensive Evaluation of Image-to-Video Content Generated by AI**, published in ***BenchCouncil Transactions on Benchmarks, Standards and Evaluations (TBench)***.
<a href='https://www.benchcouncil.org/AIGCBench/'><img src='https://img.shields.io/badge/Project-Website-orange'></a>
## Description
This dataset is intended for the evaluation of video generation tasks. Our dataset includes image-text pairs and video-text pairs. The dataset comprises three parts:
1. `ours` - A custom generation of image-text samples.
2. `webvid val` - A subset of 1000 video samples from the WebVid val dataset.
3. `laion-aesthetics` - A subset of the LAION dataset that includes 925 curated image-text samples.
## Data Organization
The dataset is organized into the following folders and files:
- `t2i_aspect_ratio_625.zip` - Contains images paired with text, adjusted to an aspect ratio of 0.625.
- `webvid_eval_1000.txt` - Contains the video names of 1000 selected video samples. Because the first frame of a video may not contain the main content or may simply be a bad frame, we use the tenth frame of each video as the initial frame (see the extraction sketch after this list).
- `Laion-aesthetics_select_samples.txt` - Contains metadata and annotations for 925 image-text samples.
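Since the tenth frame of each WebVid video serves as the initial frame, the following sketch shows one way to extract it with OpenCV. The use of OpenCV and the example path are illustrative assumptions; the benchmark does not prescribe specific tooling here:
```python
import cv2  # pip install opencv-python

def tenth_frame(video_path):
    # Return the tenth frame (zero-based index 9) of a video as a BGR array.
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, 9)  # seek to the tenth frame
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise ValueError(f"could not read frame 10 of {video_path}")
    return frame

# Hypothetical usage with a video named in webvid_eval_1000.txt.
frame = tenth_frame("videos/example.mp4")
```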
## Acknowledgments
We would like to thank all contributors and organizations behind the data sources, especially the maintainers of WebVid and LAION datasets.
## Contact Information
[email protected] and [email protected]
## Citation
If you find our work useful in your research, please consider citing our paper:
```bibtex
@misc{fan2024aigcbench,
title={AIGCBench: Comprehensive Evaluation of Image-to-Video Content Generated by AI},
author={Fanda Fan and Chunjie Luo and Wanling Gao and Jianfeng Zhan},
year={2024},
eprint={2401.01651},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` | stevenfan/AIGCBench_v1.0 | [
"size_categories:1K<n<10K",
"license:apache-2.0",
"arxiv:2401.01651",
"region:us"
] | 2024-01-07T05:47:09+00:00 | {"license": "apache-2.0", "size_categories": ["1K<n<10K"]} | 2024-01-24T08:12:28+00:00 | [
"2401.01651"
] | [] | TAGS
#size_categories-1K<n<10K #license-apache-2.0 #arxiv-2401.01651 #region-us
| # AIGCBench v1.0
AIGCBench is a novel and comprehensive benchmark designed for evaluating the capabilities of state-of-the-art video generation algorithms. Official dataset for the paper: AIGCBench: Comprehensive Evaluation of Image-to-Video Content Generated by AI, published in *BenchCouncil Transactions on Benchmarks, Standards and Evaluations (TBench)*.
<a href='URL src='URL
## Description
This dataset is intended for the evaluation of video generation tasks. Our dataset includes image-text pairs and video-text pairs. The dataset comprises three parts:
1. 'ours' - A custom generation of image-text samples.
2. 'webvid val' - A subset of 1000 video samples from the WebVid val dataset.
3. 'laion-aesthetics' - A subset of the LAION dataset that includes 925 curated image-text samples.
## Data Organization
The dataset is organized into the following folders and files:
- 't2i_aspect_ratio_625.zip' - Contains images paired with text, adjusted to an aspect ratio of 0.625.
- 'webvid_eval_1000.txt' - Contains video names for 1000 selected video samples. Considering that the first frame of the video may not contain the main information or might be a bad case, we use the tenth frame of the video as the initial frame.
- 'Laion-aesthetics_select_samples.txt' - Contains metadata and annotations for 925 image-text samples.
## Acknowledgments
We would like to thank all contributors and organizations behind the data sources, especially the maintainers of WebVid and LAION datasets.
## Contact Information
fanfanda@URL and jianfengzhan.benchcouncil@URL
If you find our work useful in your research, please consider citing our paper:
| [
"# AIGCBench v1.0\n\nAIGCBench is a novel and comprehensive benchmark designed for evaluating the capabilities of state-of-the-art video generation algorithms. Official dataset for the paper:AIGCBench: Comprehensive Evaluation of Image-to-Video Content Generated by AI, *BenchCouncil Transactions on Benchmarks, Standards and Evaluations (TBench)*.\n<a href='URL src='URL",
"## Description\n\nThis dataset is intended for the evaluation of video generation tasks. Our dataset includes image-text pairs and video-text pairs. The dataset comprises three parts:\n\n1. 'ours' - A custom generation of image-text samples.\n2. 'webvid val' - A subset of 1000 video samples from the WebVid val dataset.\n3. 'laion-aesthetics' - A subset of LAION dataset that includes 925 curated image-text samples.",
"## Data Organization\n\nThe dataset is organized into the following folders and files:\n\n- 't2i_aspect_ratio_625.zip' - Contains images paired with text, adjusted to an aspect ratio of 0.625.\n- 'webvid_eval_1000.txt' - Contains video names for 1000 selected video samples. Considering that the first frame of the video may not contain the main information or might be a bad case, we use the tenth frame of the video as the initial frame.\n- 'Laion-aesthetics_select_samples.txt' - Contains metadata and annotations for 925 image-text samples.",
"## Acknowledgments\nWe would like to thank all contributors and organizations behind the data sources, especially the maintainers of WebVid and LAION datasets.",
"## Contact Information\nfanfanda@URL and jianfengzhan.benchcouncil@URL\n\nIf you find our work useful in your research, please consider citing our paper:"
] | [
"TAGS\n#size_categories-1K<n<10K #license-apache-2.0 #arxiv-2401.01651 #region-us \n",
"# AIGCBench v1.0\n\nAIGCBench is a novel and comprehensive benchmark designed for evaluating the capabilities of state-of-the-art video generation algorithms. Official dataset for the paper:AIGCBench: Comprehensive Evaluation of Image-to-Video Content Generated by AI, *BenchCouncil Transactions on Benchmarks, Standards and Evaluations (TBench)*.\n<a href='URL src='URL",
"## Description\n\nThis dataset is intended for the evaluation of video generation tasks. Our dataset includes image-text pairs and video-text pairs. The dataset comprises three parts:\n\n1. 'ours' - A custom generation of image-text samples.\n2. 'webvid val' - A subset of 1000 video samples from the WebVid val dataset.\n3. 'laion-aesthetics' - A subset of LAION dataset that includes 925 curated image-text samples.",
"## Data Organization\n\nThe dataset is organized into the following folders and files:\n\n- 't2i_aspect_ratio_625.zip' - Contains images paired with text, adjusted to an aspect ratio of 0.625.\n- 'webvid_eval_1000.txt' - Contains video names for 1000 selected video samples. Considering that the first frame of the video may not contain the main information or might be a bad case, we use the tenth frame of the video as the initial frame.\n- 'Laion-aesthetics_select_samples.txt' - Contains metadata and annotations for 925 image-text samples.",
"## Acknowledgments\nWe would like to thank all contributors and organizations behind the data sources, especially the maintainers of WebVid and LAION datasets.",
"## Contact Information\nfanfanda@URL and jianfengzhan.benchcouncil@URL\n\nIf you find our work useful in your research, please consider citing our paper:"
] | [
35,
104,
110,
148,
37,
40
] | [
"passage: TAGS\n#size_categories-1K<n<10K #license-apache-2.0 #arxiv-2401.01651 #region-us \n# AIGCBench v1.0\n\nAIGCBench is a novel and comprehensive benchmark designed for evaluating the capabilities of state-of-the-art video generation algorithms. Official dataset for the paper:AIGCBench: Comprehensive Evaluation of Image-to-Video Content Generated by AI, *BenchCouncil Transactions on Benchmarks, Standards and Evaluations (TBench)*.\n<a href='URL src='URL## Description\n\nThis dataset is intended for the evaluation of video generation tasks. Our dataset includes image-text pairs and video-text pairs. The dataset comprises three parts:\n\n1. 'ours' - A custom generation of image-text samples.\n2. 'webvid val' - A subset of 1000 video samples from the WebVid val dataset.\n3. 'laion-aesthetics' - A subset of LAION dataset that includes 925 curated image-text samples.## Data Organization\n\nThe dataset is organized into the following folders and files:\n\n- 't2i_aspect_ratio_625.zip' - Contains images paired with text, adjusted to an aspect ratio of 0.625.\n- 'webvid_eval_1000.txt' - Contains video names for 1000 selected video samples. Considering that the first frame of the video may not contain the main information or might be a bad case, we use the tenth frame of the video as the initial frame.\n- 'Laion-aesthetics_select_samples.txt' - Contains metadata and annotations for 925 image-text samples.## Acknowledgments\nWe would like to thank all contributors and organizations behind the data sources, especially the maintainers of WebVid and LAION datasets.## Contact Information\nfanfanda@URL and jianfengzhan.benchcouncil@URL\n\nIf you find our work useful in your research, please consider citing our paper:"
] |
6363f1f0b7726ff1cdd6529562502a76f1a559aa |
This is a concatenation of several other datasets, mostly converted to Alpaca-style prompts, intended to provide a solid logic dataset for stabilizing or fine-tuning models. | ibivibiv/variety-logic-training | [
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-07T05:48:16+00:00 | {"language": ["en"], "license": "apache-2.0", "pretty_name": "A Variety of Combined Logic Data", "dataset_info": {"features": [{"name": "INSTRUCTION", "dtype": "string"}, {"name": "RESPONSE", "dtype": "string"}, {"name": "SOURCE", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 117622037, "num_examples": 110214}], "download_size": 24688336, "dataset_size": 117622037}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-07T05:50:27+00:00 | [] | [
"en"
] | TAGS
#language-English #license-apache-2.0 #region-us
|
This is a concatenation of several other datasets, mostly converted to Alpaca-style prompts, intended to provide a solid logic dataset for stabilizing or fine-tuning models. | [] | [
"TAGS\n#language-English #license-apache-2.0 #region-us \n"
] | [
18
] | [
"passage: TAGS\n#language-English #license-apache-2.0 #region-us \n"
] |
a568782fc96566103282dbf978c24c9ebf97ca4b | # Dataset Card for "asqa_origin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | NomaDamas/asqa_origin | [
"region:us"
] | 2024-01-07T05:54:03+00:00 | {"dataset_info": [{"config_name": "dev", "features": [{"name": "ambiguous_question", "dtype": "string"}, {"name": "qa_pairs", "list": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "short_answers", "sequence": "string"}, {"name": "wikipage", "dtype": "string"}]}, {"name": "wikipages", "list": [{"name": "title", "dtype": "string"}, {"name": "url", "dtype": "string"}]}, {"name": "annotations", "list": [{"name": "knowledge", "list": [{"name": "content", "dtype": "string"}, {"name": "wikipage", "dtype": "string"}]}, {"name": "long_answer", "dtype": "string"}]}, {"name": "__index_level_0__", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 2986266, "num_examples": 948}], "download_size": 1460867, "dataset_size": 2986266}, {"config_name": "train", "features": [{"name": "ambiguous_question", "dtype": "string"}, {"name": "qa_pairs", "list": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "short_answers", "sequence": "string"}, {"name": "wikipage", "dtype": "string"}]}, {"name": "wikipages", "list": [{"name": "title", "dtype": "string"}, {"name": "url", "dtype": "string"}]}, {"name": "annotations", "list": [{"name": "knowledge", "list": [{"name": "content", "dtype": "string"}, {"name": "wikipage", "dtype": "string"}]}, {"name": "long_answer", "dtype": "string"}]}, {"name": "__index_level_0__", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9765983, "num_examples": 4353}], "download_size": 5336235, "dataset_size": 9765983}], "configs": [{"config_name": "dev", "data_files": [{"split": "validation", "path": "dev/validation-*"}]}, {"config_name": "train", "data_files": [{"split": "train", "path": "train/train-*"}]}]} | 2024-01-07T05:55:29+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "asqa_origin"
More Information needed | [
"# Dataset Card for \"asqa_origin\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"asqa_origin\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"asqa_origin\"\n\nMore Information needed"
] |
5e937f76cc36c82e966c417b5b0ea2f904841517 |
# Dataset Card for GEM/viggo
## Dataset Description
- **Homepage:** https://nlds.soe.ucsc.edu/viggo
- **Repository:** [Needs More Information]
- **Paper:** https://aclanthology.org/W19-8623/
- **Leaderboard:** N/A
- **Point of Contact:** Juraj Juraska
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/viggo).
### Dataset Summary
ViGGO is an English data-to-text generation dataset in the video game domain, with target responses being more conversational than information-seeking, yet constrained to the information presented in a meaning representation. The dataset is relatively small, with about 5,000 examples, but very clean, and can thus serve for evaluating transfer learning, low-resource, or few-shot capabilities of neural models.
You can load the dataset via:
```
import datasets
data = datasets.load_dataset('GEM/viggo')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/viggo).
#### website
[Website](https://nlds.soe.ucsc.edu/viggo)
#### paper
[ACL Anthology](https://aclanthology.org/W19-8623/)
#### authors
Juraj Juraska, Kevin K. Bowden, Marilyn Walker
## Dataset Overview
### Where to find the Data and its Documentation
#### Webpage
<!-- info: What is the webpage for the dataset (if it exists)? -->
<!-- scope: telescope -->
[Website](https://nlds.soe.ucsc.edu/viggo)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[ACL Anthology](https://aclanthology.org/W19-8623/)
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
```
@inproceedings{juraska-etal-2019-viggo,
title = "{V}i{GGO}: A Video Game Corpus for Data-To-Text Generation in Open-Domain Conversation",
author = "Juraska, Juraj and
Bowden, Kevin and
Walker, Marilyn",
booktitle = "Proceedings of the 12th International Conference on Natural Language Generation",
month = oct # "{--}" # nov,
year = "2019",
address = "Tokyo, Japan",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/W19-8623",
doi = "10.18653/v1/W19-8623",
pages = "164--172",
}
```
#### Contact Name
<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
Juraj Juraska
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
[email protected]
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
no
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
no
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`English`
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
cc-by-sa-4.0: Creative Commons Attribution Share Alike 4.0 International
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
ViGGO was designed for the task of data-to-text generation in chatbots (as opposed to task-oriented dialogue systems), with target responses being more conversational than information-seeking, yet constrained to the information presented in a meaning representation. The dataset, being relatively small and clean, can also serve for demonstrating transfer learning capabilities of neural models.
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Data-to-Text
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`academic`
#### Curation Organization(s)
<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
University of California, Santa Cruz
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
Juraj Juraska, Kevin K. Bowden, Marilyn Walker
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
Juraj Juraska
### Dataset Structure
#### Data Fields
<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
Each example in the dataset has the following two fields:
- `mr`: A meaning representation (MR) that, in a structured format, provides the information to convey, as well as the desired dialogue act (DA) type.
- `ref`: A reference output, i.e., a corresponding utterance realizing all the information in the MR.
Each MR is a flattened dictionary of attribute-and-value pairs, "wrapped" in the dialogue act type indication. This format was chosen primarily for its compactness, but also to allow for easy concatenation of multiple DAs (each with potentially different attributes) in a single MR.
Following is the list of all possible attributes (which are also referred to as "slots") in ViGGO along with their types/possible values:
- `name`: The name of a video game (e.g., Rise of the Tomb Raider).
- `release_year`: The year a video game was released in (e.g., 2015).
- `exp_release_date`: For a not-yet-released game, the date when it is expected to be released (e.g., February 22, 2019). *Note: This slot cannot appear together with `release_year` in the same dialogue act.*
- `developer`: The name of the studio/person that created the game (e.g., Crystal Dynamics).
- `genres`: A list of one or more genre labels from a set of possible values (e.g., action-adventure, shooter).
- `player_perspective`: A list of one or more perspectives from which the game is/can be played (possible values: first person, third person, side view, bird view).
- `platforms`: A list of one or more gaming platforms the game was officially released for (possible values: PC, PlayStation, Xbox, Nintendo, Nintendo Switch).
- `esrb`: A game's content rating as determined by the ESRB (possible values: E (for Everyone), E 10+ (for Everyone 10 and Older), T (for Teen), M (for Mature)).
- `rating`: Depending on the dialogue act this slot is used with, it is a categorical representation of either the game's average rating or the game's liking (possible values: excellent, good, average, poor).
- `has_multiplayer`: Indicates whether a game supports multiplayer or can only be played in single-player mode (possible values: yes, no).
- `available_on_steam`: Indicates whether a game can be purchased through the Steam digital distribution service (possible values: yes, no).
- `has_linux_release`: Indicates whether a game is supported on Linux operating systems (possible values: yes, no).
- `has_mac_release`: Indicates whether a game is supported on macOS (possible values: yes, no).
- `specifier`: A game specifier used by the `request` DA, typically an adjective (e.g., addictive, easiest, overrated, visually impressive).
Each MR in the dataset has 3 distinct reference utterances, which are represented as 3 separate examples with the same MR.
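To make the MR notation above concrete, here is a small illustrative parser. It is not part of the dataset distribution; it simply assumes the `da_type(slot[value], ...)` shape described in this section:
```
import re

def parse_mr(mr):
    # Split a ViGGO MR into its dialogue act type and a slot dict,
    # e.g. "inform(name[FIFA 12], rating[good])"
    #   -> ("inform", {"name": "FIFA 12", "rating": "good"})
    match = re.fullmatch(r"(\w+)\((.*)\)", mr.strip())
    if match is None:
        raise ValueError(f"unrecognized MR: {mr}")
    da_type, args = match.groups()
    # Slot values may contain commas (e.g. list-type slots), so match
    # bracketed spans instead of splitting the argument string on ", ".
    slots = dict(re.findall(r"(\w+)\[(.*?)\]", args))
    return da_type, slots

da, slots = parse_mr("give_opinion(name[SpellForce 3], rating[poor], "
                     "genres[real-time strategy, role-playing], "
                     "player_perspective[bird view])")
```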
#### Reason for Structure
<!-- info: How was the dataset structure determined? -->
<!-- scope: microscope -->
The dataset structure mostly follows the format of the popular E2E dataset, however, with added dialogue act type indications, new list-type attributes introduced, and unified naming convention for multi-word attribute names.
#### Example Instance
<!-- info: Provide a JSON formatted example of a typical instance in the dataset. -->
<!-- scope: periscope -->
```
{
"mr": "give_opinion(name[SpellForce 3], rating[poor], genres[real-time strategy, role-playing], player_perspective[bird view])",
"ref": "I think that SpellForce 3 is one of the worst games I've ever played. Trying to combine the real-time strategy and role-playing genres just doesn't work, and the bird view perspective makes it near impossible to play."
}
```
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
ViGGO is split into 3 partitions, with no MRs in common between the training set and either the validation or the test set (and that *after* delexicalizing the `name` and `developer` slots). The ratio of examples in the partitions is approximately 7.5 : 1 : 1.5, with their exact sizes listed below:
- **Train:** 5,103 (1,675 unique MRs)
- **Validation:** 714 (238 unique MRs)
- **Test:** 1,083 (359 unique MRs)
- **TOTAL:** 6,900 (2,253 unique MRs)
*Note: The reason why the number of unique MRs is not exactly one third of all examples is that for each `request_attribute` DA (which only has one slot, and that without a value) 12 reference utterances were collected instead of 3.*
#### Splitting Criteria
<!-- info: Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g., if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here. -->
<!-- scope: microscope -->
A similar MR length and slot distribution was preserved across the partitions. The distribution of DA types, on the other hand, is skewed slightly toward fewer `inform` DA instances (the most prevalent DA type) and a higher proportion of the less prevalent DAs in the validation and the test set.
#### Outlier Example
<!-- info: What does an outlier of the dataset in terms of length/perplexity/embedding look like? -->
<!-- scope: microscope -->
```
{
"mr": "request_attribute(player_perspective[])",
"ref": "Is there a certain player perspective that you prefer over others in games you play?"
},
{
"mr": "inform(name[FIFA 12], esrb[E (for Everyone)], genres[simulation, sport], player_perspective[bird view, side view], platforms[PlayStation, Xbox, Nintendo, PC], available_on_steam[no])",
"ref": "Fifa 12 is a decent sports simulator. It's pretty cool how the game swaps from the bird's eye perspective down to a side view while you're playing. You can get the game for PlayStation, Xbox, Nintendo consoles, and PC, but unfortunately it's not on Steam. Of course, as a sports game there's not much objectionable content so it's rated E."
},
{
"mr": "inform(name[Super Bomberman], release_year[1993], genres[action, strategy], has_multiplayer[no], platforms[Nintendo, PC], available_on_steam[no], has_linux_release[no], has_mac_release[no])",
"ref": "Super Bomberman is one of my favorite Nintendo games, also available on PC, though not through Steam. It came out all the way back in 1993, and you can't get it for any modern consoles, unfortunately, so no online multiplayer, or of course Linux or Mac releases either. That said, it's still one of the most addicting action-strategy games out there."
}
```
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Why is the Dataset in GEM?
<!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? -->
<!-- scope: microscope -->
ViGGO is a fairly small dataset but includes a greater variety of utterance types than most other datasets for NLG from structured meaning representations. This makes it more interesting from the perspective of model evaluation, since models have to learn to differentiate between various dialogue act types that share the same slots.
#### Similar Datasets
<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
yes
#### Unique Language Coverage
<!-- info: Does this dataset cover other languages than other datasets for the same task? -->
<!-- scope: periscope -->
no
#### Difference from other GEM datasets
<!-- info: What else sets this dataset apart from other similar datasets in GEM? -->
<!-- scope: microscope -->
ViGGO's language is more casual and conversational -- as opposed to information-seeking -- which differentiates it from the majority of popular datasets for the same type of data-to-text task. Moreover, the video game domain is a rather uncommon one in the NLG community, despite being very well-suited for data-to-text generation, considering it offers entities with many attributes to talk about, which can be described in a structured format.
### GEM-Specific Curation
#### Modificatied for GEM?
<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
no
#### Additional Splits?
<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
no
### Getting Started with the Task
#### Pointers to Resources
<!-- info: Getting started with in-depth research on the task. Add relevant pointers to resources that researchers can consult when they want to get started digging deeper into the task. -->
<!-- scope: microscope -->
- [E2E NLG Challenge](http://www.macs.hw.ac.uk/InteractionLab/E2E/)
#### Technical Terms
<!-- info: Technical terms used in this card and the dataset and their definitions -->
<!-- scope: microscope -->
- MR = meaning representation
- DA = dialogue act
## Previous Results
### Previous Results
#### Metrics
<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`BLEU`, `METEOR`, `ROUGE`, `BERT-Score`, `BLEURT`, `Other: Other Metrics`
#### Other Metrics
<!-- info: Definitions of other metrics -->
<!-- scope: periscope -->
SER (slot error rate): Indicates the proportion of missing/incorrect/duplicate/hallucinated slot mentions in the utterances across a test set. The closer to zero a model scores in this metric, the more semantically accurate its outputs are. This metric is typically calculated either manually on a small sample of generated outputs, or heuristically using domain-specific regex rules and gazetteers.
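As a toy illustration of the heuristic flavor of this metric (the published evaluations use a far more robust slot aligner; the naive verbatim matching below is an assumption for demonstration only and over-counts paraphrased slot mentions):
```
def toy_ser(slots, utterance):
    # Crude slot error rate: the fraction of slot values that do not
    # appear verbatim in the utterance. Real SER tooling uses per-slot
    # regex rules and gazetteers instead of exact substring matching.
    if not slots:
        return 0.0
    text = utterance.lower()
    missing = sum(1 for v in slots.values() if v and v.lower() not in text)
    return missing / len(slots)

ser = toy_ser({"name": "SpellForce 3", "rating": "poor"},
              "I think that SpellForce 3 is one of the worst games ever.")
# ser == 0.5 here: "poor" is paraphrased as "worst", which naive
# matching wrongly counts as a missing slot.
```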
#### Previous results available?
<!-- info: Are previous results available? -->
<!-- scope: telescope -->
yes
#### Relevant Previous Results
<!-- info: What are the most relevant previous results for this task/dataset? -->
<!-- scope: microscope -->
- [Juraska et al., 2019. ViGGO: A Video Game Corpus for Data-To-Text Generation in Open-Domain Conversation.](https://aclanthology.org/W19-8623/)
- [Harkous et al., 2020. Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity.](https://aclanthology.org/2020.coling-main.218/)
- [Kedzie and McKeown, 2020. Controllable Meaning Representation to Text Generation: Linearization and Data Augmentation Strategies.](https://aclanthology.org/2020.emnlp-main.419/)
- [Juraska and Walker, 2021. Attention Is Indeed All You Need: Semantically Attention-Guided Decoding for Data-to-Text NLG.](https://aclanthology.org/2021.inlg-1.45/)
## Dataset Curation
### Original Curation
#### Original Curation Rationale
<!-- info: Original curation rationale -->
<!-- scope: telescope -->
The primary motivation behind ViGGO was to create a data-to-text corpus in a new but conversational domain, and intended for use in open-domain chatbots rather than task-oriented dialogue systems. To this end, the dataset contains utterances of 9 generalizable and conversational dialogue act types, revolving around various aspects of video games. The idea is that similar, relatively small datasets could fairly easily be collected for other conversational domains -- especially other entertainment domains (such as music or books), but perhaps also topics like animals or food -- to support an open-domain conversational agent with controllable neural NLG.
Another desired quality of the ViGGO dataset was cleanliness (no typos and grammatical errors) and semantic accuracy, which has often not been the case with other crowdsourced data-to-text corpora. In general, for the data-to-text generation task, there is arguably no need to put the burden on the generation model to figure out the noise, since the noise would not be expected to be there in a real-world system, whose dialogue manager (which creates the input for the NLG module) is usually configurable and tightly controlled.
#### Communicative Goal
<!-- info: What was the communicative goal? -->
<!-- scope: periscope -->
Produce a response from a structured meaning representation in the context of a conversation about video games. It can be a brief opinion or a description of a game, as well as a request for attribute (e.g., genre, player perspective, or platform) preference/confirmation or an inquiry about liking a particular type of games.
#### Sourced from Different Sources
<!-- info: Is the dataset aggregated from different data sources? -->
<!-- scope: telescope -->
no
### Language Data
#### How was Language Data Obtained?
<!-- info: How was the language data obtained? -->
<!-- scope: telescope -->
`Crowdsourced`
#### Where was it crowdsourced?
<!-- info: If crowdsourced, where from? -->
<!-- scope: periscope -->
`Amazon Mechanical Turk`
#### Language Producers
<!-- info: What further information do we have on the language producers? -->
<!-- scope: microscope -->
The paid crowdworkers who produced the reference utterances were from English-speaking countries, and they had at least 1,000 HITs approved and a HIT approval rate of 98% or more. Furthermore, in the instructions, crowdworkers were discouraged from taking on the task unless they considered themselves a gamer.
#### Topics Covered
<!-- info: Does the language in the dataset focus on specific topics? How would you describe them? -->
<!-- scope: periscope -->
The dataset focuses on video games and their various aspects, and hence the language of the utterances may contain video game-specific jargon.
#### Data Validation
<!-- info: Was the text validated by a different worker or a data curator? -->
<!-- scope: telescope -->
validated by data curator
#### Data Preprocessing
<!-- info: How was the text data pre-processed? (Enter N/A if the text was not pre-processed) -->
<!-- scope: microscope -->
First, regular expressions were used to enforce several standardization policies regarding special characters, punctuation, and the correction of undesired abbreviations/misspellings of standard domain-specific terms (e.g., terms like "Play station" or "PS4" would be changed to the uniform "PlayStation"). At the same time, hyphens were removed or enforced uniformly in certain terms, for example, "single-player". Although phrases such as "first person" should correctly have a hyphen when used as adjective, the crowdworkers used this rule very inconsistently. In order to avoid model outputs being penalized during the evaluation by the arbitrary choice of a hyphen presence or absence in the reference utterances, the hyphen was removed in all such phrases regardless of the noun vs. adjective use.
Second, an extensive set of heuristics was developed to identify slot-related errors. This process revealed the vast majority of missing or incorrect slot mentions, which were subsequently fixed according to the corresponding MRs. This eventually led to the development of a robust, cross-domain, heuristic slot aligner that can be used for automatic slot error rate evaluation. For details, see the appendix in [Juraska and Walker, 2021](https://aclanthology.org/2021.inlg-1.45/).
Crowdworkers would sometimes also inject a piece of information which was not present in the MR, some of which is not even represented by any of the slots, e.g., plot or main characters. This unsolicited information was removed from the utterances so as to avoid confusing the neural model. Finally, any remaining typos and grammatical errors were resolved.
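A minimal sketch of this style of regex standardization follows. The exact rules and term lists used by the curators are not published in this card, so the patterns below are illustrative assumptions:
```
import re

# Illustrative normalization rules in the spirit described above.
RULES = [
    (re.compile(r"\bplay\s*station\b", re.IGNORECASE), "PlayStation"),
    (re.compile(r"\bsingle player\b", re.IGNORECASE), "single-player"),
    # Remove the hyphen in perspective phrases regardless of usage.
    (re.compile(r"\bfirst-person\b", re.IGNORECASE), "first person"),
    (re.compile(r"\bthird-person\b", re.IGNORECASE), "third person"),
]

def standardize(utterance):
    # Apply each substitution rule in order to one reference utterance.
    for pattern, replacement in RULES:
        utterance = pattern.sub(replacement, utterance)
    return utterance

print(standardize("A first-person game for the Play station."))
# -> "A first person game for the PlayStation."
```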
#### Was Data Filtered?
<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
manually
#### Filter Criteria
<!-- info: What were the selection criteria? -->
<!-- scope: microscope -->
Compliance with the indicated dialogue act type, semantic accuracy (i.e., all information in the corresponding MR mentioned and that correctly), and minimal extraneous information (e.g., personal experience/opinion). Whenever it was within a reasonable amount of effort, the utterances were manually fixed instead of being discarded/crowdsourced anew.
### Structured Annotations
#### Additional Annotations?
<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
none
#### Annotation Service?
<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
no
### Consent
#### Any Consent Policy?
<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
no
### Private Identifying Information (PII)
#### Contains PII?
<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
no PII
#### Justification for no PII
<!-- info: Provide a justification for selecting `no PII` above. -->
<!-- scope: periscope -->
Crowdworkers were instructed to only express the information in the provided meaning representation, which never prompted them to mention anything about themselves. Occasionally, they would still include a bit of personal experience (e.g., "I used to like the game as a kid.") or opinion, but these would be too general to be considered PII.
### Maintenance
#### Any Maintenance Plan?
<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset ore related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for exemple because their language, language variety, or social or geographical context is underepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
no
## Considerations for Using the Data
### PII Risks and Liability
### Licenses
### Known Technical Limitations
#### Technical Limitations
<!-- info: Describe any known technical limitations, such as spurrious correlations, train/test overlap, annotation biases, or mis-annotations, and cite the works that first identified these limitations when possible. -->
<!-- scope: microscope -->
The dataset is limited to a single domain: video games. One caveat of using a language generator trained on this dataset in a dialogue system as-is is that multiple subsequent turns discussing the same video game would be repeating its full name. ViGGO was designed for generation without context, and therefore it is up to the dialogue manager to ensure that pronouns are substituted for the names whenever it would sound more natural in a dialogue. Alternately, the dataset can easily be augmented with automatically constructed samples which omit the `name` slot in the MR and replace the name with a pronoun in the reference utterance.
| AlexFromSynlabs/sllm | [
"task_categories:table-to-text",
"annotations_creators:none",
"language_creators:unknown",
"multilinguality:unknown",
"size_categories:unknown",
"source_datasets:original",
"language:en",
"license:cc-by-sa-4.0",
"data-to-text",
"region:us"
] | 2024-01-07T05:57:26+00:00 | {"annotations_creators": ["none"], "language_creators": ["unknown"], "language": ["en"], "license": ["cc-by-sa-4.0"], "multilinguality": ["unknown"], "size_categories": ["unknown"], "source_datasets": ["original"], "task_categories": ["table-to-text"], "task_ids": [], "pretty_name": "viggo", "tags": ["data-to-text"]} | 2024-01-08T13:35:54+00:00 | [] | [
"en"
] | TAGS
#task_categories-table-to-text #annotations_creators-none #language_creators-unknown #multilinguality-unknown #size_categories-unknown #source_datasets-original #language-English #license-cc-by-sa-4.0 #data-to-text #region-us
|
# Dataset Card for GEM/viggo
## Dataset Description
- Homepage: URL
- Repository:
- Paper: URL
- Leaderboard: N/A
- Point of Contact: Juraj Juraska
### Link to Main Data Card
You can find the main data card on the GEM Website.
### Dataset Summary
ViGGO is an English data-to-text generation dataset in the video game domain, with target responses being more conversational than information-seeking, yet constrained to the information presented in a meaning representation. The dataset is relatively small, with about 5,000 examples, but very clean, and can thus serve for evaluating transfer learning, low-resource, or few-shot capabilities of neural models.
You can load the dataset via:
The data loader can be found here.
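A minimal loading sketch (the `GEM/viggo` identifier follows the GEM naming convention, and the `mr`/`ref` field names are taken from the Data Fields section below; the actual GEM loader may expose slightly different column names):

```python
from datasets import load_dataset

# Load ViGGO through the Hugging Face `datasets` library.
dataset = load_dataset("GEM/viggo")

example = dataset["train"][0]
print(example["mr"])   # structured meaning representation
print(example["ref"])  # corresponding reference utterance
```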
#### website
Website
#### paper
ACL Anthology
#### authors
Juraj Juraska, Kevin K. Bowden, Marilyn Walker
## Dataset Overview
### Where to find the Data and its Documentation
#### Webpage
Website
#### Paper
ACL Anthology
#### BibTex
#### Contact Name
Juraj Juraska
#### Contact Email
jjuraska@URL
#### Has a Leaderboard?
no
### Languages and Intended Use
#### Multilingual?
no
#### Covered Languages
'English'
#### License
cc-by-sa-4.0: Creative Commons Attribution Share Alike 4.0 International
#### Intended Use
ViGGO was designed for the task of data-to-text generation in chatbots (as opposed to task-oriented dialogue systems), with target responses being more conversational than information-seeking, yet constrained to the information presented in a meaning representation. The dataset, being relatively small and clean, can also serve for demonstrating transfer learning capabilities of neural models.
#### Primary Task
Data-to-Text
### Credit
#### Curation Organization Type(s)
'academic'
#### Curation Organization(s)
University of California, Santa Cruz
#### Dataset Creators
Juraj Juraska, Kevin K. Bowden, Marilyn Walker
#### Who added the Dataset to GEM?
Juraj Juraska
### Dataset Structure
#### Data Fields
Each example in the dataset has the following two fields:
- 'mr': A meaning representation (MR) that, in a structured format, provides the information to convey, as well as the desired dialogue act (DA) type.
- 'ref': A reference output, i.e., a corresponding utterance realizing all the information in the MR.
Each MR is a flattened dictionary of attribute-and-value pairs, "wrapped" in the dialogue act type indication. This format was chosen primarily for its compactness, but also to allow for easy concatenation of multiple DAs (each with potentially different attributes) in a single MR.
Following is the list of all possible attributes (which are also referred to as "slots") in ViGGO along with their types/possible values:
- 'name': The name of a video game (e.g., Rise of the Tomb Raider).
- 'release_year': The year a video game was released in (e.g., 2015).
- 'exp_release_date': For a not-yet-released game, the date when it is expected to be released (e.g., February 22, 2019). *Note: This slot cannot appear together with 'release_year' in the same dialogue act.*
- 'developer': The name of the studio/person that created the game (e.g., Crystal Dynamics).
- 'genres': A list of one or more genre labels from a set of possible values (e.g., action-adventure, shooter).
- 'player_perspective': A list of one or more perspectives from which the game is/can be played (possible values: first person, third person, side view, bird view).
- 'platforms': A list of one or more gaming platforms the game was officially released for (possible values: PC, PlayStation, Xbox, Nintendo, Nintendo Switch).
- 'esrb': A game's content rating as determined by the ESRB (possible values: E (for Everyone), E 10+ (for Everyone 10 and Older), T (for Teen), M (for Mature)).
- 'rating': Depending on the dialogue act this slot is used with, it is a categorical representation of either the game's average rating or the game's liking (possible values: excellent, good, average, poor).
- 'has_multiplayer': Indicates whether a game supports multiplayer or can only be played in single-player mode (possible values: yes, no).
- 'available_on_steam': Indicates whether a game can be purchased through the Steam digital distribution service (possible values: yes, no).
- 'has_linux_release': Indicates whether a game is supported on Linux operating systems (possible values: yes, no).
- 'has_mac_release': Indicates whether a game is supported on macOS (possible values: yes, no).
- 'specifier': A game specifier used by the 'request' DA, typically an adjective (e.g., addictive, easiest, overrated, visually impressive).
Each MR in the dataset has 3 distinct reference utterances, which are represented as 3 separate examples with the same MR.
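As a rough illustration of this format, an MR string can be split into its DA type and slot dictionary with a couple of regular expressions (a sketch only, under the assumption that slot values never contain unescaped brackets):

```python
import re

def parse_mr(mr: str):
    """Split an MR such as 'inform(name[Portal 2], rating[excellent])'
    into its dialogue act type and a slot dictionary (illustrative sketch)."""
    da_type, slot_str = re.match(r"(\w+)\((.*)\)\s*$", mr.strip()).groups()
    # Each slot has the form `slot_name[value]`; list-type slot values may
    # contain commas, so match the bracketed structure instead of splitting
    # the string on commas.
    slots = dict(re.findall(r"(\w+)\[(.*?)\]", slot_str))
    return da_type, slots

da, slots = parse_mr("inform(name[Portal 2], genres[puzzle, platformer])")
# da == "inform"; slots == {"name": "Portal 2", "genres": "puzzle, platformer"}
```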
#### Reason for Structure
The dataset structure mostly follows the format of the popular E2E dataset, but with added dialogue act type indications, newly introduced list-type attributes, and a unified naming convention for multi-word attribute names.
#### Example Instance
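The following instance is constructed from the slot descriptions above purely for illustration (it is not an actual record from the dataset):

```
mr:  give_opinion(name[Rise of the Tomb Raider], rating[excellent], genres[action-adventure], player_perspective[third person])
ref: Rise of the Tomb Raider is an excellent action-adventure game; I loved playing it from the third person perspective.
```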
#### Data Splits
ViGGO is split into 3 partitions, with no MRs in common between the training set and either of the validation and the test set (and that *after* delexicalizing the 'name' and 'developer' slots). The ratio of examples in the partitions is approximately 7.5 : 1 : 1.5, with their exact sizes listed below:
- Train: 5,103 (1,675 unique MRs)
- Validation: 714 (238 unique MRs)
- Test: 1,083 (359 unique MRs)
- TOTAL: 6,900 (2,253 unique MRs)
*Note: The reason why the number of unique MRs is not exactly one third of all examples is that for each 'request_attribute' DA (which only has one slot, and that without a value) 12 reference utterances were collected instead of 3.*
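Assuming the loader sketched earlier, these counts can be verified directly (the `mr` field name is again an assumption carried over from the Data Fields section):

```python
from datasets import load_dataset

dataset = load_dataset("GEM/viggo")
for split in ("train", "validation", "test"):
    unique_mrs = {example["mr"] for example in dataset[split]}
    print(f"{split}: {len(dataset[split])} examples, {len(unique_mrs)} unique MRs")
# Expected: 5,103/1,675 for train, 714/238 for validation, 1,083/359 for test.
```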
#### Splitting Criteria
A similar MR length and slot distribution was preserved across the partitions. The distribution of DA types, on the other hand, is skewed slightly toward fewer 'inform' DA instances (the most prevalent DA type) and a higher proportion of the less prevalent DAs in the validation and the test set.
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Why is the Dataset in GEM?
ViGGO is a fairly small dataset but includes a greater variety of utterance types than most other datasets for NLG from structured meaning representations. This makes it more interesting from the perspective of model evaluation, since models have to learn to differentiate between various dialogue act types that share the same slots.
#### Similar Datasets
yes
#### Unique Language Coverage
no
#### Difference from other GEM datasets
ViGGO's language is more casual and conversational -- as opposed to information-seeking -- which differentiates it from the majority of popular datasets for the same type of data-to-text task. Moreover, the video game domain is a rather uncommon one in the NLG community, despite being very well-suited for data-to-text generation, considering it offers entities with many attributes to talk about, which can be described in a structured format.
### GEM-Specific Curation
#### Modified for GEM?
no
#### Additional Splits?
no
### Getting Started with the Task
#### Pointers to Resources
- E2E NLG Challenge
#### Technical Terms
- MR = meaning representation
- DA = dialogue act
## Previous Results
### Previous Results
#### Metrics
'BLEU', 'METEOR', 'ROUGE', 'BERT-Score', 'BLEURT', 'Other: Other Metrics'
#### Other Metrics
SER (slot error rate): Indicates the proportion of missing/incorrect/duplicate/hallucinated slot mentions in the utterances across a test set. The closer to zero a model scores in this metric, the more semantically accurate its outputs are. This metric is typically calculated either manually on a small sample of generated outputs, or heuristically using domain-specific regex rules and gazetteers.
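A very naive version of such a heuristic simply checks whether each slot value from the MR appears verbatim in the generated text (a sketch only; real slot aligners use per-slot regex rules and gazetteers, for instance to catch a `has_multiplayer[yes]` slot realized as "supports multiplayer"):

```python
def naive_ser(slots: dict, output: str) -> float:
    """Fraction of slots whose value does not occur verbatim in the output.
    Illustrative only: boolean and categorical slots generally need
    slot-specific matching rules rather than substring search."""
    if not slots:
        return 0.0
    missing = sum(
        1 for value in slots.values()
        if value and value.lower() not in output.lower()
    )
    return missing / len(slots)

# One of the two slot values ("excellent") is missing from the output.
print(naive_ser({"name": "Portal 2", "rating": "excellent"},
                "Portal 2 is a decent game."))  # 0.5
```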
#### Previous results available?
yes
#### Relevant Previous Results
- Juraska et al., 2019. ViGGO: A Video Game Corpus for Data-To-Text Generation in Open-Domain Conversation.
- Harkous et al., 2020. Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity.
- Kedzie and McKeown, 2020. Controllable Meaning Representation to Text Generation: Linearization and Data Augmentation Strategies.
- Juraska and Walker, 2021. Attention Is Indeed All You Need: Semantically Attention-Guided Decoding for Data-to-Text NLG.
## Dataset Curation
### Original Curation
#### Original Curation Rationale
The primary motivation behind ViGGO was to create a data-to-text corpus in a new but conversational domain, and intended for use in open-domain chatbots rather than task-oriented dialogue systems. To this end, the dataset contains utterances of 9 generalizable and conversational dialogue act types, revolving around various aspects of video games. The idea is that similar, relatively small datasets could fairly easily be collected for other conversational domains -- especially other entertainment domains (such as music or books), but perhaps also topics like animals or food -- to support an open-domain conversational agent with controllable neural NLG.
Another desired quality of the ViGGO dataset was cleanliness (no typos and grammatical errors) and semantic accuracy, which has often not been the case with other crowdsourced data-to-text corpora. In general, for the data-to-text generation task, there is arguably no need to put the burden on the generation model to figure out the noise, since the noise would not be expected to be there in a real-world system, where the dialogue manager that creates the input for the NLG module is usually configurable and tightly controlled.
#### Communicative Goal
Produce a response from a structured meaning representation in the context of a conversation about video games. It can be a brief opinion or a description of a game, as well as a request for attribute (e.g., genre, player perspective, or platform) preference/confirmation or an inquiry about liking a particular type of games.
#### Sourced from Different Sources
no
### Language Data
#### How was Language Data Obtained?
'Crowdsourced'
#### Where was it crowdsourced?
'Amazon Mechanical Turk'
#### Language Producers
The paid crowdworkers who produced the reference utterances were from English-speaking countries, and they had at least 1,000 HITs approved and a HIT approval rate of 98% or more. Furthermore, in the instructions, crowdworkers were discouraged from taking on the task unless they considered themselves a gamer.
#### Topics Covered
The dataset focuses on video games and their various aspects, and hence the language of the utterances may contain video game-specific jargon.
#### Data Validation
validated by data curator
#### Data Preprocessing
First, regular expressions were used to enforce several standardization policies regarding special characters, punctuation, and the correction of undesired abbreviations/misspellings of standard domain-specific terms (e.g., terms like "Play station" or "PS4" would be changed to the uniform "PlayStation"). At the same time, hyphens were removed or enforced uniformly in certain terms, for example, "single-player". Although phrases such as "first person" should correctly have a hyphen when used as an adjective, the crowdworkers used this rule very inconsistently. In order to avoid model outputs being penalized during the evaluation by the arbitrary choice of a hyphen presence or absence in the reference utterances, the hyphen was removed in all such phrases regardless of the noun vs. adjective use.
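A sketch of this kind of rule-based standardization (the replacement table is illustrative, reconstructed from the examples in this paragraph rather than taken from the actual preprocessing scripts):

```python
import re

# Illustrative normalization rules based on the examples above.
REPLACEMENTS = [
    (re.compile(r"\bplay\s*station\b|\bps4\b", re.IGNORECASE), "PlayStation"),
    (re.compile(r"\bsingle\s*-?\s*player\b", re.IGNORECASE), "single-player"),
    (re.compile(r"\bfirst-person\b", re.IGNORECASE), "first person"),
]

def normalize(utterance: str) -> str:
    for pattern, replacement in REPLACEMENTS:
        utterance = pattern.sub(replacement, utterance)
    return utterance

print(normalize("I play it on my Play station in first-person view."))
# -> "I play it on my PlayStation in first person view."
```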
Second, an extensive set of heuristics was developed to identify slot-related errors. This process revealed the vast majority of missing or incorrect slot mentions, which were subsequently fixed according to the corresponding MRs. This eventually led to the development of a robust, cross-domain, heuristic slot aligner that can be used for automatic slot error rate evaluation. For details, see the appendix in Juraska and Walker, 2021.
Crowdworkers would sometimes also inject a piece of information which was not present in the MR, some of which is not even represented by any of the slots, e.g., plot or main characters. This unsolicited information was removed from the utterances so as to avoid confusing the neural model. Finally, any remaining typos and grammatical errors were resolved.
#### Was Data Filtered?
manually
#### Filter Criteria
Compliance with the indicated dialogue act type, semantic accuracy (i.e., all information in the corresponding MR mentioned and that correctly), and minimal extraneous information (e.g., personal experience/opinion). Whenever it was within a reasonable amount of effort, the utterances were manually fixed instead of being discarded/crowdsourced anew.
### Structured Annotations
#### Additional Annotations?
none
#### Annotation Service?
no
### Consent
#### Any Consent Policy?
no
### Private Identifying Information (PII)
#### Contains PII?
no PII
#### Justification for no PII
Crowdworkers were instructed to only express the information in the provided meaning representation, which never prompted them to mention anything about themselves. Occasionally, they would still include a bit of personal experience (e.g., "I used to like the game as a kid.") or opinion, but these would be too general to be considered PII.
### Maintenance
#### Any Maintenance Plan?
no
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
no
### Discussion of Biases
#### Any Documented Social Biases?
no
## Considerations for Using the Data
### PII Risks and Liability
### Licenses
### Known Technical Limitations
#### Technical Limitations
The dataset is limited to a single domain: video games. One caveat of using a language generator trained on this dataset as-is in a dialogue system is that multiple subsequent turns discussing the same video game would repeat its full name. ViGGO was designed for generation without context, and therefore it is up to the dialogue manager to ensure that pronouns are substituted for the names whenever that would sound more natural in a dialogue. Alternatively, the dataset can easily be augmented with automatically constructed samples which omit the 'name' slot in the MR and replace the name with a pronoun in the reference utterance.
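A sketch of that augmentation idea, building on the `parse_mr` helper above (pronoun choice is simplified to "it", and the game name is assumed to occur verbatim in the reference):

```python
def augment_without_name(mr: str, ref: str):
    """Create a delexicalized variant of an example that drops the `name`
    slot and refers to the game with a pronoun (illustrative sketch)."""
    da_type, slots = parse_mr(mr)
    name = slots.pop("name", None)
    if not name or name not in ref:
        return None  # nothing to substitute
    new_mr = f"{da_type}({', '.join(f'{k}[{v}]' for k, v in slots.items())})"
    new_ref = ref.replace(name, "it", 1)
    return new_mr, new_ref
```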
|
f77f1605df167e6464bfca153adc0632546cb9c1 | # Dataset Card for "training_v0.0.6-public"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | male-2/training_v0.0.6-public | [
"region:us"
] | 2024-01-07T06:35:25+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "emotion", "struct": [{"name": "joyful", "dtype": "bool"}, {"name": "sad", "dtype": "bool"}, {"name": "angry", "dtype": "bool"}]}, {"name": "example", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1054, "num_examples": 1}], "download_size": 10130, "dataset_size": 1054}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T11:55:36+00:00 | [] | [] | TAGS
#region-us
|
bc44d6daa143b7ae8334ea15ffe0c32d74259b7c |
Speech quotes from a little cat.
A shorter version is available here: [Mxode/Meow-Instruct-12k](https://huggingface.co/datasets/Mxode/Meow-Instruct-12k) | Mxode/Meow-Instruct-34k | [
"task_categories:conversational",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:zh",
"license:apache-2.0",
"region:us"
] | 2024-01-07T07:10:14+00:00 | {"language": ["zh"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation"], "pretty_name": "meow-34k"} | 2024-01-09T15:15:53+00:00 | [] | [
"zh"
] | TAGS
#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Chinese #license-apache-2.0 #region-us
|
A collection of sayings spoken by a little cat.
A shorter version is available here: Mxode/Meow-Instruct-12k | [] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Chinese #license-apache-2.0 #region-us \n"
] | [
52
] | [
"passage: TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Chinese #license-apache-2.0 #region-us \n"
] |
c97ac943a441d95a5daafbb3b3f7e99158fc6ff6 |
This dataset was created by [DeepSim: deep learning code functional similarity](https://dl.acm.org/doi/10.1145/3236024.3236068).
I downloaded `googlejam4.tar.gz` from [parasol-aser/deepsim](https://github.com/parasol-aser/deepsim/), fixed encoding of `6/googlejam6.p261.Round1B.java` and `1/googlejam1.p815.MushroomMonster.java`, and re-compressed.
The `all` split (all 12 problems) is consistent with their paper (I guess...).
The `test` split (problems 5, 6, 7, 8, 12) is used in the experiments of [Language Models are Universal Embedders](https://arxiv.org/pdf/2310.08232.pdf).
| problem | code num |
|---------|----------|
| 1 | 478 |
| 2 | 88 |
| 3 | 242 |
| 4 | 38 |
| 5 | 2 |
| 6 | 435 |
| 7 | 27 |
| 8 | 245 |
| 9 | 68 |
| 10 | 18 |
| 11 | 20 |
| 12 | 4 |
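
For a quick sanity check, the splits above can be pulled with the `datasets` library. This is a minimal sketch under the assumption that the repository loads with its default configuration and exposes the `all` and `test` splits named on this card:

```python
from datasets import load_dataset

# Assumption: default config; split names taken from this card.
all_split = load_dataset("izhx/google-code-jam", split="all")
test_split = load_dataset("izhx/google-code-jam", split="test")

print(len(all_split), "programs in `all` (all 12 problems)")
print(len(test_split), "programs in `test` (problems 5, 6, 7, 8, 12)")
```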
| izhx/google-code-jam | [
"license:mit",
"arxiv:2310.08232",
"region:us"
] | 2024-01-07T07:48:01+00:00 | {"license": "mit"} | 2024-01-07T12:22:56+00:00 | [
"2310.08232"
] | [] | TAGS
#license-mit #arxiv-2310.08232 #region-us
| This dataset is created by DeepSim: deep learning code functional similarity.
I downloaded 'URL' from parasol-aser/deepsim, fixed encoding of '6/URL' and '1/URL', and re-compressed.
The 'all' split (all 12 problems) is consistent with their paper (I guess...).
The 'test' split (problem 5, 6, 7, 8, 12) is used in experiments of Language Models are Universal Embedders.
| [] | [
"TAGS\n#license-mit #arxiv-2310.08232 #region-us \n"
] | [
20
] | [
"passage: TAGS\n#license-mit #arxiv-2310.08232 #region-us \n"
] |
5a69ec0aa7b9bce1decdad0149316efcb11b7fba | # Llama 2
We are unlocking the power of large language models. Our latest version of Llama is now accessible to individuals, creators, researchers and businesses of all sizes so that they can experiment, innovate and scale their ideas responsibly.
This release includes model weights and starting code for pretrained and fine-tuned Llama language models — ranging from 7B to 70B parameters.
This repository is intended as a minimal example to load [Llama 2](https://ai.meta.com/research/publications/llama-2-open-foundation-and-fine-tuned-chat-models/) models and run inference. For more detailed examples leveraging Hugging Face, see [llama-recipes](https://github.com/facebookresearch/llama-recipes/).
## Updates post-launch
See [UPDATES.md](UPDATES.md). Also for a running list of frequently asked questions, see [here](https://ai.meta.com/llama/faq/).
## Download
⚠️ **7/18: We're aware of people encountering a number of download issues today. Anyone still encountering issues should remove all local files, re-clone the repository, and [request a new download link](https://ai.meta.com/resources/models-and-libraries/llama-downloads/). It's critical to do all of these in case you have local corrupt files.**
In order to download the model weights and tokenizer, please visit the [Meta website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License.
Once your request is approved, you will receive a signed URL over email. Then run the download.sh script, passing the URL provided when prompted to start the download.
Pre-requisites: Make sure you have `wget` and `md5sum` installed. Then to run the script: `./download.sh`.
Keep in mind that the links expire after 24 hours and a certain amount of downloads. If you start seeing errors such as `403: Forbidden`, you can always re-request a link.
### Access on Hugging Face
We are also providing downloads on [Hugging Face](https://huggingface.co/meta-llama). You must first request a download from the Meta website using the same email address as your Hugging Face account. After doing so, you can request access to any of the models on Hugging Face and within 1-2 days your account will be granted access to all versions.
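
As an illustration of that route, here is a minimal sketch of loading one of the chat checkpoints with the `transformers` library. The model id, `device_map` setting, and generation parameters are assumptions chosen for the example, not part of this repository:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: access to this gated repo has been granted and the
# account is authenticated (e.g., via `huggingface-cli login`).
model_id = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the `accelerate` package.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```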
## Quick Start
You can follow the steps below to quickly get up and running with Llama 2 models. These steps will let you run quick inference locally. For more examples, see the [Llama 2 recipes repository](https://github.com/facebookresearch/llama-recipes).
1. In a conda env with PyTorch / CUDA available clone and download this repository.
2. In the top level directory run:
```bash
pip install -e .
```
3. Visit the [Meta website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and register to download the model/s.
4. Once registered, you will get an email with a URL to download the models. You will need this URL when you run the download.sh script.
5. Once you get the email, navigate to your downloaded llama repository and run the download.sh script.
- Make sure to grant execution permissions to the download.sh script
- During this process, you will be prompted to enter the URL from the email.
- Do not use the “Copy Link” option but rather make sure to manually copy the link from the email.
6. Once the model/s you want have been downloaded, you can run the model locally using the command below:
```bash
torchrun --nproc_per_node 1 example_chat_completion.py \
--ckpt_dir llama-2-7b-chat/ \
--tokenizer_path tokenizer.model \
--max_seq_len 512 --max_batch_size 6
```
**Note**
- Replace `llama-2-7b-chat/` with the path to your checkpoint directory and `tokenizer.model` with the path to your tokenizer model.
- The `--nproc_per_node` should be set to the [MP](#inference) value for the model you are using.
- Adjust the `max_seq_len` and `max_batch_size` parameters as needed.
- This example runs the [example_chat_completion.py](example_chat_completion.py) found in this repository but you can change that to a different .py file.
## Inference
Different models require different model-parallel (MP) values:
| Model | MP |
|--------|----|
| 7B | 1 |
| 13B | 2 |
| 70B | 8 |
All models support sequence length up to 4096 tokens, but we pre-allocate the cache according to `max_seq_len` and `max_batch_size` values. So set those according to your hardware.
### Pretrained Models
These models are not finetuned for chat or Q&A. They should be prompted so that the expected answer is the natural continuation of the prompt.
See `example_text_completion.py` for some examples. To illustrate, see the command below to run it with the llama-2-7b model (`nproc_per_node` needs to be set to the `MP` value):
```
torchrun --nproc_per_node 1 example_text_completion.py \
--ckpt_dir llama-2-7b/ \
--tokenizer_path tokenizer.model \
--max_seq_len 128 --max_batch_size 4
```
### Fine-tuned Chat Models
The fine-tuned models were trained for dialogue applications. To get the expected features and performance for them, a specific formatting defined in [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212)
needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces).
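
For single-turn prompts, that formatting can be sketched as follows — treat it as an approximation of what `chat_completion` does, not a drop-in replacement (in particular, `BOS`/`EOS` tokens are handled by the tokenizer, and multi-turn history needs the full logic in `generation.py`):

```python
# Sketch of the single-turn Llama 2 chat format described above.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def format_prompt(system: str, user: str) -> str:
    # strip() guards against the double-space issue mentioned above.
    return f"{B_INST} {B_SYS}{system.strip()}{E_SYS}{user.strip()} {E_INST}"

print(format_prompt("You are a helpful assistant.", "Explain quicksort briefly."))
```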
You can also deploy additional classifiers for filtering out inputs and outputs that are deemed unsafe. See the llama-recipes repo for [an example](https://github.com/facebookresearch/llama-recipes/blob/main/inference/inference.py) of how to add a safety checker to the inputs and outputs of your inference code.
Examples using llama-2-7b-chat:
```
torchrun --nproc_per_node 1 example_chat_completion.py \
--ckpt_dir llama-2-7b-chat/ \
--tokenizer_path tokenizer.model \
--max_seq_len 512 --max_batch_size 6
```
Llama 2 is a new technology that carries potential risks with use. Testing conducted to date has not — and could not — cover all scenarios.
In order to help developers address these risks, we have created the [Responsible Use Guide](Responsible-Use-Guide.pdf). More details can be found in our research paper as well.
## Issues
Please report any software “bug”, or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Model Card
See [MODEL_CARD.md](MODEL_CARD.md).
## License
Our model and weights are licensed for both researchers and commercial entities, upholding the principles of openness. Our mission is to empower individuals, and industry through this opportunity, while fostering an environment of discovery and ethical AI advancements.
See the [LICENSE](LICENSE) file, as well as our accompanying [Acceptable Use Policy](USE_POLICY.md)
## References
1. [Research Paper](https://ai.meta.com/research/publications/llama-2-open-foundation-and-fine-tuned-chat-models/)
2. [Llama 2 technical overview](https://ai.meta.com/resources/models-and-libraries/llama)
3. [Open Innovation AI Research Community](https://ai.meta.com/llama/open-innovation-ai-research-community/)
For common questions, the FAQ can be found [here](https://ai.meta.com/llama/faq/) which will be kept up to date over time as new questions arise.
## Original LLaMA
The repo for the original llama release is in the [`llama_v1`](https://github.com/facebookresearch/llama/tree/llama_v1) branch.
| SofiaHussain/big_data | [
"region:us"
] | 2024-01-07T07:48:49+00:00 | {} | 2024-01-07T08:23:03+00:00 | [] | [] | TAGS
#region-us
| Llama 2
=======
We are unlocking the power of large language models. Our latest version of Llama is now accessible to individuals, creators, researchers and businesses of all sizes so that they can experiment, innovate and scale their ideas responsibly.
This release includes model weights and starting code for pretrained and fine-tuned Llama language models — ranging from 7B to 70B parameters.
This repository is intended as a minimal example to load Llama 2 models and run inference. For more detailed examples leveraging Hugging Face, see llama-recipes.
Updates post-launch
-------------------
See URL. Also for a running list of frequently asked questions, see here.
Download
--------
7/18: We're aware of people encountering a number of download issues today. Anyone still encountering issues should remove all local files, re-clone the repository, and request a new download link. It's critical to do all of these in case you have local corrupt files.
In order to download the model weights and tokenizer, please visit the Meta website and accept our License.
Once your request is approved, you will receive a signed URL over email. Then run the URL script, passing the URL provided when prompted to start the download.
Pre-requisites: Make sure you have 'wget' and 'md5sum' installed. Then to run the script: './URL'.
Keep in mind that the links expire after 24 hours and a certain amount of downloads. If you start seeing errors such as '403: Forbidden', you can always re-request a link.
### Access on Hugging Face
We are also providing downloads on Hugging Face. You must first request a download from the Meta website using the same email address as your Hugging Face account. After doing so, you can request access to any of the models on Hugging Face and within 1-2 days your account will be granted access to all versions.
Quick Start
-----------
You can follow the steps below to quickly get up and running with Llama 2 models. These steps will let you run quick inference locally. For more examples, see the Llama 2 recipes repository.
1. In a conda env with PyTorch / CUDA available clone and download this repository.
2. In the top level directory run:
3. Visit the Meta website and register to download the model/s.
4. Once registered, you will get an email with a URL to download the models. You will need this URL when you run the URL script.
5. Once you get the email, navigate to your downloaded llama repository and run the URL script.
* Make sure to grant execution permissions to the URL script
* During this process, you will be prompted to enter the URL from the email.
* Do not use the “Copy Link” option but rather make sure to manually copy the link from the email.
6. Once the model/s you want have been downloaded, you can run the model locally using the command below:
Note
* Replace 'llama-2-7b-chat/' with the path to your checkpoint directory and 'URL' with the path to your tokenizer model.
* The '–nproc\_per\_node' should be set to the MP value for the model you are using.
* Adjust the 'max\_seq\_len' and 'max\_batch\_size' parameters as needed.
* This example runs the example\_chat\_completion.py found in this repository but you can change that to a different .py file.
Inference
---------
Different models require different model-parallel (MP) values:
All models support sequence length up to 4096 tokens, but we pre-allocate the cache according to 'max\_seq\_len' and 'max\_batch\_size' values. So set those according to your hardware.
### Pretrained Models
These models are not finetuned for chat or Q&A. They should be prompted so that the expected answer is the natural continuation of the prompt.
See 'example\_text\_completion.py' for some examples. To illustrate, see the command below to run it with the llama-2-7b model ('nproc\_per\_node' needs to be set to the 'MP' value):
### Fine-tuned Chat Models
The fine-tuned models were trained for dialogue applications. To get the expected features and performance for them, a specific formatting defined in 'chat\_completion'
needs to be followed, including the 'INST' and '<>' tags, 'BOS' and 'EOS' tokens, and the whitespaces and breaklines in between (we recommend calling 'strip()' on inputs to avoid double-spaces).
You can also deploy additional classifiers for filtering out inputs and outputs that are deemed unsafe. See the llama-recipes repo for an example of how to add a safety checker to the inputs and outputs of your inference code.
Examples using llama-2-7b-chat:
Llama 2 is a new technology that carries potential risks with use. Testing conducted to date has not — and could not — cover all scenarios.
In order to help developers address these risks, we have created the Responsible Use Guide. More details can be found in our research paper as well.
Issues
------
Please report any software “bug”, or other problems with the models through one of the following means:
* Reporting issues with the model: URL
* Reporting risky content generated by the model: URL
* Reporting bugs and security concerns: URL
Model Card
----------
See MODEL\_CARD.md.
License
-------
Our model and weights are licensed for both researchers and commercial entities, upholding the principles of openness. Our mission is to empower individuals, and industry through this opportunity, while fostering an environment of discovery and ethical AI advancements.
See the LICENSE file, as well as our accompanying Acceptable Use Policy
References
----------
1. Research Paper
2. Llama 2 technical overview
3. Open Innovation AI Research Community
For common questions, the FAQ can be found here which will be kept up to date over time as new questions arise.
Original LLaMA
--------------
The repo for the original llama release is in the 'llama\_v1' branch.
| [
"### Access on Hugging Face\n\n\nWe are also providing downloads on Hugging Face. You must first request a download from the Meta website using the same email address as your Hugging Face account. After doing so, you can request access to any of the models on Hugging Face and within 1-2 days your account will be granted access to all versions.\n\n\nQuick Start\n-----------\n\n\nYou can follow the steps below to quickly get up and running with Llama 2 models. These steps will let you run quick inference locally. For more examples, see the Llama 2 recipes repository.\n\n\n1. In a conda env with PyTorch / CUDA available clone and download this repository.\n2. In the top level directory run:\n3. Visit the Meta website and register to download the model/s.\n4. Once registered, you will get an email with a URL to download the models. You will need this URL when you run the URL script.\n5. Once you get the email, navigate to your downloaded llama repository and run the URL script.\n\n\n\t* Make sure to grant execution permissions to the URL script\n\t* During this process, you will be prompted to enter the URL from the email.\n\t* Do not use the “Copy Link” option but rather make sure to manually copy the link from the email.\n6. Once the model/s you want have been downloaded, you can run the model locally using the command below:\n\n\nNote\n\n\n* Replace 'llama-2-7b-chat/' with the path to your checkpoint directory and 'URL' with the path to your tokenizer model.\n* The '–nproc\\_per\\_node' should be set to the MP value for the model you are using.\n* Adjust the 'max\\_seq\\_len' and 'max\\_batch\\_size' parameters as needed.\n* This example runs the example\\_chat\\_completion.py found in this repository but you can change that to a different .py file.\n\n\nInference\n---------\n\n\nDifferent models require different model-parallel (MP) values:\n\n\n\nAll models support sequence length up to 4096 tokens, but we pre-allocate the cache according to 'max\\_seq\\_len' and 'max\\_batch\\_size' values. So set those according to your hardware.",
"### Pretrained Models\n\n\nThese models are not finetuned for chat or Q&A. They should be prompted so that the expected answer is the natural continuation of the prompt.\n\n\nSee 'example\\_text\\_completion.py' for some examples. To illustrate, see the command below to run it with the llama-2-7b model ('nproc\\_per\\_node' needs to be set to the 'MP' value):",
"### Fine-tuned Chat Models\n\n\nThe fine-tuned models were trained for dialogue applications. To get the expected features and performance for them, a specific formatting defined in 'chat\\_completion'\nneeds to be followed, including the 'INST' and '<>' tags, 'BOS' and 'EOS' tokens, and the whitespaces and breaklines in between (we recommend calling 'strip()' on inputs to avoid double-spaces).\n\n\nYou can also deploy additional classifiers for filtering out inputs and outputs that are deemed unsafe. See the llama-recipes repo for an example of how to add a safety checker to the inputs and outputs of your inference code.\n\n\nExamples using llama-2-7b-chat:\n\n\nLlama 2 is a new technology that carries potential risks with use. Testing conducted to date has not — and could not — cover all scenarios.\nIn order to help developers address these risks, we have created the Responsible Use Guide. More details can be found in our research paper as well.\n\n\nIssues\n------\n\n\nPlease report any software “bug”, or other problems with the models through one of the following means:\n\n\n* Reporting issues with the model: URL\n* Reporting risky content generated by the model: URL\n* Reporting bugs and security concerns: URL\n\n\nModel Card\n----------\n\n\nSee MODEL\\_CARD.md.\n\n\nLicense\n-------\n\n\nOur model and weights are licensed for both researchers and commercial entities, upholding the principles of openness. Our mission is to empower individuals, and industry through this opportunity, while fostering an environment of discovery and ethical AI advancements.\n\n\nSee the LICENSE file, as well as our accompanying Acceptable Use Policy\n\n\nReferences\n----------\n\n\n1. Research Paper\n2. Llama 2 technical overview\n3. Open Innovation AI Research Community\n\n\nFor common questions, the FAQ can be found here which will be kept up to date over time as new questions arise.\n\n\nOriginal LLaMA\n--------------\n\n\nThe repo for the original llama release is in the 'llama\\_v1' branch."
] | [
"TAGS\n#region-us \n",
"### Access on Hugging Face\n\n\nWe are also providing downloads on Hugging Face. You must first request a download from the Meta website using the same email address as your Hugging Face account. After doing so, you can request access to any of the models on Hugging Face and within 1-2 days your account will be granted access to all versions.\n\n\nQuick Start\n-----------\n\n\nYou can follow the steps below to quickly get up and running with Llama 2 models. These steps will let you run quick inference locally. For more examples, see the Llama 2 recipes repository.\n\n\n1. In a conda env with PyTorch / CUDA available clone and download this repository.\n2. In the top level directory run:\n3. Visit the Meta website and register to download the model/s.\n4. Once registered, you will get an email with a URL to download the models. You will need this URL when you run the URL script.\n5. Once you get the email, navigate to your downloaded llama repository and run the URL script.\n\n\n\t* Make sure to grant execution permissions to the URL script\n\t* During this process, you will be prompted to enter the URL from the email.\n\t* Do not use the “Copy Link” option but rather make sure to manually copy the link from the email.\n6. Once the model/s you want have been downloaded, you can run the model locally using the command below:\n\n\nNote\n\n\n* Replace 'llama-2-7b-chat/' with the path to your checkpoint directory and 'URL' with the path to your tokenizer model.\n* The '–nproc\\_per\\_node' should be set to the MP value for the model you are using.\n* Adjust the 'max\\_seq\\_len' and 'max\\_batch\\_size' parameters as needed.\n* This example runs the example\\_chat\\_completion.py found in this repository but you can change that to a different .py file.\n\n\nInference\n---------\n\n\nDifferent models require different model-parallel (MP) values:\n\n\n\nAll models support sequence length up to 4096 tokens, but we pre-allocate the cache according to 'max\\_seq\\_len' and 'max\\_batch\\_size' values. So set those according to your hardware.",
"### Pretrained Models\n\n\nThese models are not finetuned for chat or Q&A. They should be prompted so that the expected answer is the natural continuation of the prompt.\n\n\nSee 'example\\_text\\_completion.py' for some examples. To illustrate, see the command below to run it with the llama-2-7b model ('nproc\\_per\\_node' needs to be set to the 'MP' value):",
"### Fine-tuned Chat Models\n\n\nThe fine-tuned models were trained for dialogue applications. To get the expected features and performance for them, a specific formatting defined in 'chat\\_completion'\nneeds to be followed, including the 'INST' and '<>' tags, 'BOS' and 'EOS' tokens, and the whitespaces and breaklines in between (we recommend calling 'strip()' on inputs to avoid double-spaces).\n\n\nYou can also deploy additional classifiers for filtering out inputs and outputs that are deemed unsafe. See the llama-recipes repo for an example of how to add a safety checker to the inputs and outputs of your inference code.\n\n\nExamples using llama-2-7b-chat:\n\n\nLlama 2 is a new technology that carries potential risks with use. Testing conducted to date has not — and could not — cover all scenarios.\nIn order to help developers address these risks, we have created the Responsible Use Guide. More details can be found in our research paper as well.\n\n\nIssues\n------\n\n\nPlease report any software “bug”, or other problems with the models through one of the following means:\n\n\n* Reporting issues with the model: URL\n* Reporting risky content generated by the model: URL\n* Reporting bugs and security concerns: URL\n\n\nModel Card\n----------\n\n\nSee MODEL\\_CARD.md.\n\n\nLicense\n-------\n\n\nOur model and weights are licensed for both researchers and commercial entities, upholding the principles of openness. Our mission is to empower individuals, and industry through this opportunity, while fostering an environment of discovery and ethical AI advancements.\n\n\nSee the LICENSE file, as well as our accompanying Acceptable Use Policy\n\n\nReferences\n----------\n\n\n1. Research Paper\n2. Llama 2 technical overview\n3. Open Innovation AI Research Community\n\n\nFor common questions, the FAQ can be found here which will be kept up to date over time as new questions arise.\n\n\nOriginal LLaMA\n--------------\n\n\nThe repo for the original llama release is in the 'llama\\_v1' branch."
] | [
6,
507,
102,
463
] | [
"passage: TAGS\n#region-us \n",
"passage: ### Access on Hugging Face\n\n\nWe are also providing downloads on Hugging Face. You must first request a download from the Meta website using the same email address as your Hugging Face account. After doing so, you can request access to any of the models on Hugging Face and within 1-2 days your account will be granted access to all versions.\n\n\nQuick Start\n-----------\n\n\nYou can follow the steps below to quickly get up and running with Llama 2 models. These steps will let you run quick inference locally. For more examples, see the Llama 2 recipes repository.\n\n\n1. In a conda env with PyTorch / CUDA available clone and download this repository.\n2. In the top level directory run:\n3. Visit the Meta website and register to download the model/s.\n4. Once registered, you will get an email with a URL to download the models. You will need this URL when you run the URL script.\n5. Once you get the email, navigate to your downloaded llama repository and run the URL script.\n\n\n\t* Make sure to grant execution permissions to the URL script\n\t* During this process, you will be prompted to enter the URL from the email.\n\t* Do not use the “Copy Link” option but rather make sure to manually copy the link from the email.\n6. Once the model/s you want have been downloaded, you can run the model locally using the command below:\n\n\nNote\n\n\n* Replace 'llama-2-7b-chat/' with the path to your checkpoint directory and 'URL' with the path to your tokenizer model.\n* The '–nproc\\_per\\_node' should be set to the MP value for the model you are using.\n* Adjust the 'max\\_seq\\_len' and 'max\\_batch\\_size' parameters as needed.\n* This example runs the example\\_chat\\_completion.py found in this repository but you can change that to a different .py file.\n\n\nInference\n---------\n\n\nDifferent models require different model-parallel (MP) values:\n\n\n\nAll models support sequence length up to 4096 tokens, but we pre-allocate the cache according to 'max\\_seq\\_len' and 'max\\_batch\\_size' values. So set those according to your hardware.### Pretrained Models\n\n\nThese models are not finetuned for chat or Q&A. They should be prompted so that the expected answer is the natural continuation of the prompt.\n\n\nSee 'example\\_text\\_completion.py' for some examples. To illustrate, see the command below to run it with the llama-2-7b model ('nproc\\_per\\_node' needs to be set to the 'MP' value):"
] |
83886c8cb9d116adb8565a25cc069aad26a70a3c | # Dataset Card for Python Q/A pair
<!-- Provide a quick summary of the dataset. -->
This dataset card provides information about the Python Q/A pair dataset.
## Dataset Details
### Dataset Description
The Python Q/A pair dataset is a preprocessed version of a Python Q/A dataset from StackOverflow, which was originally hosted on Kaggle. The dataset contains high-ranked questions and their corresponding high-ranked answers, sorted from high to low rank.
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
This dataset can be used for tasks such as question answering, text generation, and conversational AI research and development.
[More Information Needed]
### Out-of-Scope Use
This dataset should not be used for tasks outside of natural language processing, such as image recognition or voice recognition.
[More Information Needed]
## Dataset Structure
The dataset contains 100k rows of high-ranked questions and their corresponding high-ranked answers from StackOverflow.
[More Information Needed]
## Dataset Creation
### Curation Rationale
The dataset was curated to provide a resource for developing and testing natural language processing models, particularly in the domain of question answering and text generation.
[More Information Needed]
### Source Data
The data in this dataset comes from StackOverflow Q/A pairs that were ranked 1 or above. The raw form of this dataset is hosted on Kaggle.
#### Data Collection and Processing
The data was collected from StackOverflow and preprocessed to include only high-ranked questions and their corresponding high-ranked answers.
[More Information Needed]
#### Who are the source data producers?
The source data was produced by users of StackOverflow.
[More Information Needed]
### Annotations [optional]
This dataset does not contain any additional annotations.
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
The dataset does not contain any personal or sensitive information as it was derived from publicly available data on StackOverflow.
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | HassanSamo/Python-Q_A | [
"task_categories:question-answering",
"task_categories:text-generation",
"task_categories:conversational",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"python",
"region:us"
] | 2024-01-07T08:27:06+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["question-answering", "text-generation", "conversational"], "pretty_name": "StackOverflow's Python Question-Answering Pair Dataset", "tags": ["python"]} | 2024-01-24T18:39:47+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #task_categories-conversational #size_categories-n<1K #language-English #license-apache-2.0 #python #region-us
| # Dataset Card for Python Q/A pair
This dataset card provides information about the Python Q/A pair dataset.
## Dataset Details
### Dataset Description
The Python Q/A pair dataset is a preprocessed version of a Python Q/A dataset from StackOverflow, which was originally hosted on Kaggle. The dataset contains high-ranked questions and their corresponding high-ranked answers, sorted from high to low rank.
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
This dataset can be used for tasks such as question answering, text generation, and conversational AI research and development.
### Out-of-Scope Use
This dataset should not be used for tasks outside of natural language processing, such as image recognition or voice recognition.
## Dataset Structure
The dataset contains 100k rows of high-ranked questions and their corresponding high-ranked answers from StackOverflow.
## Dataset Creation
### Curation Rationale
The dataset was curated to provide a resource for developing and testing natural language processing models, particularly in the domain of question answering and text generation.
### Source Data
The data in this dataset comes from StackOverflow Q/A pairs that were ranked 1 or above. The raw form of this dataset is hosted on Kaggle.
#### Data Collection and Processing
The data was collected from StackOverflow and preprocessed to include only high-ranked questions and their corresponding high-ranked answers.
#### Who are the source data producers?
The source data was produced by users of StackOverflow.
### Annotations [optional]
This dataset does not contain any additional annotations.
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
The dataset does not contain any personal or sensitive information as it was derived from publicly available data on StackOverflow.
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Python Q/A pair\n\n\n\nThis dataset card provides information about the Python Q/A pair dataset.",
"## Dataset Details",
"### Dataset Description\n\nThe Python Q/A pair dataset is a preprocessed version of a Python Q/A dataset from StackOverflow, which was originally hosted on Kaggle. The dataset contains high-ranked questions and their corresponding high-ranked answers, sorted from high to low rank.\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use\n\nThis dataset can be used for tasks such as question answering, text generation, and conversational AI research and development.",
"### Out-of-Scope Use\n\nThis dataset should not be used for tasks outside of natural language processing, such as image recognition or voice recognition.",
"## Dataset Structure\n\nThe dataset contains 100k rows of high-ranked questions and their corresponding high-ranked answers from StackOverflow.",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset was curated to provide a resource for developing and testing natural language processing models, particularly in the domain of question answering and text generation.",
"### Source Data\n\nThe data in this dataset comes from StackOverflow Q/A pairs that were ranked 1 or above. The raw form of this dataset is hosted on Kaggle.",
"#### Data Collection and Processing\n\nThe data was collected from StackOverflow and preprocessed to include only high-ranked questions and their corresponding high-ranked answers.",
"#### Who are the source data producers?\n\nThe source data was produced by users of StackOverflow.",
"### Annotations [optional]\n\nThis dataset does not contain any additional annotations.",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information\n\nThe dataset does not contain any personal or sensitive information as it was derived from publicly available data on StackOverflow.",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #task_categories-conversational #size_categories-n<1K #language-English #license-apache-2.0 #python #region-us \n",
"# Dataset Card for Python Q/A pair\n\n\n\nThis dataset card provides information about the Python Q/A pair dataset.",
"## Dataset Details",
"### Dataset Description\n\nThe Python Q/A pair dataset is a preprocessed version of a Python Q/A dataset from StackOverflow, which was originally hosted on Kaggle. The dataset contains high-ranked questions and their corresponding high-ranked answers, sorted from high to low rank.\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use\n\nThis dataset can be used for tasks such as question answering, text generation, and conversational AI research and development.",
"### Out-of-Scope Use\n\nThis dataset should not be used for tasks outside of natural language processing, such as image recognition or voice recognition.",
"## Dataset Structure\n\nThe dataset contains 100k rows of high-ranked questions and their corresponding high-ranked answers from StackOverflow.",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset was curated to provide a resource for developing and testing natural language processing models, particularly in the domain of question answering and text generation.",
"### Source Data\n\nThe data in this dataset comes from StackOverflow Q/A pairs that were ranked 1 or above. The raw form of this dataset is hosted on Kaggle.",
"#### Data Collection and Processing\n\nThe data was collected from StackOverflow and preprocessed to include only high-ranked questions and their corresponding high-ranked answers.",
"#### Who are the source data producers?\n\nThe source data was produced by users of StackOverflow.",
"### Annotations [optional]\n\nThis dataset does not contain any additional annotations.",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information\n\nThe dataset does not contain any personal or sensitive information as it was derived from publicly available data on StackOverflow.",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
64,
26,
4,
107,
29,
3,
30,
34,
38,
5,
39,
43,
40,
23,
21,
5,
9,
35,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#task_categories-question-answering #task_categories-text-generation #task_categories-conversational #size_categories-n<1K #language-English #license-apache-2.0 #python #region-us \n# Dataset Card for Python Q/A pair\n\n\n\nThis dataset card provides information about the Python Q/A pair dataset.## Dataset Details### Dataset Description\n\nThe Python Q/A pair dataset is a preprocessed version of a Python Q/A dataset from StackOverflow, which was originally hosted on Kaggle. The dataset contains high-ranked questions and their corresponding high-ranked answers, sorted from high to low rank.\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use\n\nThis dataset can be used for tasks such as question answering, text generation, and conversational AI research and development.### Out-of-Scope Use\n\nThis dataset should not be used for tasks outside of natural language processing, such as image recognition or voice recognition.## Dataset Structure\n\nThe dataset contains 100k rows of high-ranked questions and their corresponding high-ranked answers from StackOverflow.## Dataset Creation### Curation Rationale\n\nThe dataset was curated to provide a resource for developing and testing natural language processing models, particularly in the domain of question answering and text generation.### Source Data\n\nThe data in this dataset comes from StackOverflow Q/A pairs that were ranked 1 or above. The raw form of this dataset is hosted on Kaggle.#### Data Collection and Processing\n\nThe data was collected from StackOverflow and preprocessed to include only high-ranked questions and their corresponding high-ranked answers.#### Who are the source data producers?\n\nThe source data was produced by users of StackOverflow.### Annotations [optional]\n\nThis dataset does not contain any additional annotations."
] |
af7d6bbc3aa8dd3ed7ce9bd673f4a164d8d10770 | <p align="center"><h1>🧠 Awesome ChatGPT Prompts [CSV dataset]</h1></p>
This is a Dataset Repository of **Awesome ChatGPT Prompts**
**[View All Prompts on GitHub](https://github.com/f/awesome-chatgpt-prompts)**
# License
CC-0
| IQRA512/gpt_prompts.csv | [
"license:cc0-1.0",
"ChatGPT",
"region:us"
] | 2024-01-07T09:31:56+00:00 | {"license": "cc0-1.0", "tags": ["ChatGPT"]} | 2024-01-07T17:43:48+00:00 | [] | [] | TAGS
#license-cc0-1.0 #ChatGPT #region-us
| <p align="center"><h1> Awesome ChatGPT Prompts [CSV dataset]</h1></p>
This is a Dataset Repository of Awesome ChatGPT Prompts
View All Prompts on GitHub
# License
CC-0
| [
"# License\n\nCC-0"
] | [
"TAGS\n#license-cc0-1.0 #ChatGPT #region-us \n",
"# License\n\nCC-0"
] | [
18,
4
] | [
"passage: TAGS\n#license-cc0-1.0 #ChatGPT #region-us \n# License\n\nCC-0"
] |
7b2108a7d59f4de2022c881c9e68646257f90dd0 |
An RL environment called ItemSortingCart for the Godot Game Engine.
This environment was created with: https://github.com/edbeeching/godot_rl_agents
## Downloading the environment
After installing Godot RL Agents, download the environment with:
```
gdrl.env_from_hub -r edbeeching/godot_rl_ItemSortingCart
```
| edbeeching/godot_rl_ItemSortingCart | [
"deep-reinforcement-learning",
"reinforcement-learning",
"godot-rl",
"environments",
"video-games",
"region:us"
] | 2024-01-07T09:47:39+00:00 | {"library_name": "godot-rl", "tags": ["deep-reinforcement-learning", "reinforcement-learning", "godot-rl", "environments", "video-games"]} | 2024-01-07T12:48:34+00:00 | [] | [] | TAGS
#deep-reinforcement-learning #reinforcement-learning #godot-rl #environments #video-games #region-us
|
An RL environment called ItemSortingCart for the Godot Game Engine.
This environment was created with: URL
## Downloading the environment
After installing Godot RL Agents, download the environment with:
| [
"## Downloading the environment \n\nAfter installing Godot RL Agents, download the environment with:"
] | [
"TAGS\n#deep-reinforcement-learning #reinforcement-learning #godot-rl #environments #video-games #region-us \n",
"## Downloading the environment \n\nAfter installing Godot RL Agents, download the environment with:"
] | [
32,
20
] | [
"passage: TAGS\n#deep-reinforcement-learning #reinforcement-learning #godot-rl #environments #video-games #region-us \n## Downloading the environment \n\nAfter installing Godot RL Agents, download the environment with:"
] |
52f1e525175bf86ba405c2d587c1a06affbaaf5f | # Dataset Card for "sven"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | euisuh15/sven | [
"region:us"
] | 2024-01-07T10:02:49+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "func_name", "dtype": "string"}, {"name": "code", "dtype": "string"}, {"name": "vul_type", "dtype": "string"}, {"name": "line_changes", "dtype": "string"}, {"name": "char_changes", "dtype": "string"}, {"name": "is_vulnerable", "dtype": "bool"}, {"name": "vul_type_name", "dtype": "string"}, {"name": "vul_type_description", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1960462, "num_examples": 688}, {"name": "test", "num_bytes": 183840, "num_examples": 76}], "download_size": 383586, "dataset_size": 2144302}} | 2024-01-07T11:41:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "sven"
More Information needed | [
"# Dataset Card for \"sven\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"sven\"\n\nMore Information needed"
] | [
6,
12
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"sven\"\n\nMore Information needed"
] |
0aa0cf1ab57ec00441ef21867ba29cb7b71ee4d4 | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | ichsanbhrd/gambar_eroc | [
"task_categories:feature-extraction",
"task_categories:text-generation",
"task_categories:translation",
"task_categories:fill-mask",
"task_categories:sentence-similarity",
"size_categories:n>1T",
"art",
"climate",
"region:us"
] | 2024-01-07T10:05:57+00:00 | {"size_categories": ["n>1T"], "task_categories": ["feature-extraction", "text-generation", "translation", "fill-mask", "sentence-similarity"], "pretty_name": "sweet", "tags": ["art", "climate"]} | 2024-01-07T10:21:12+00:00 | [] | [] | TAGS
#task_categories-feature-extraction #task_categories-text-generation #task_categories-translation #task_categories-fill-mask #task_categories-sentence-similarity #size_categories-n>1T #art #climate #region-us
| # Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-feature-extraction #task_categories-text-generation #task_categories-translation #task_categories-fill-mask #task_categories-sentence-similarity #size_categories-n>1T #art #climate #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
78,
34,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#task_categories-feature-extraction #task_categories-text-generation #task_categories-translation #task_categories-fill-mask #task_categories-sentence-similarity #size_categories-n>1T #art #climate #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
4f2dcc968fb0a65e66d3f4bb999e513342bea2f3 |
3.3M try-on images extracted from the [laion_text_debiased_100M](https://huggingface.co/datasets/linyq/laion_text_debiased_100M) dataset (image size > 512). | zxbsmk/tryon_3m | [
"task_categories:text-to-image",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-07T10:24:26+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-to-image"]} | 2024-01-16T17:28:57+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-to-image #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us
|
3.3M try-on images extracted from the laion_text_debiased_100M dataset (image size > 512). | [] | [
"TAGS\n#task_categories-text-to-image #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us \n"
] | [
42
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us \n"
] |
8ceee1cb00dcdbf2a2864c00850c93412674ecb0 |
# Dataset Card for Evaluation run of xaviviro/FLOR-6.3B-xat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xaviviro/FLOR-6.3B-xat](https://huggingface.co/xaviviro/FLOR-6.3B-xat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xaviviro__FLOR-6.3B-xat",
"harness_winogrande_5",
split="train")
```
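The aggregated metrics mentioned above live in the "results" configuration; as a minimal sketch (the exact column layout of that split is an assumption here, so inspect it before relying on specific fields):

```python
from datasets import load_dataset

# Load the aggregated "results" configuration (config name taken from the card text above).
# NOTE: the schema of this split is an assumption; print it to inspect the available columns.
results = load_dataset("open-llm-leaderboard/details_xaviviro__FLOR-6.3B-xat",
	"results",
	split="latest")
print(results)
```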
## Latest results
These are the [latest results from run 2024-01-07T11:25:36.448416](https://huggingface.co/datasets/open-llm-leaderboard/details_xaviviro__FLOR-6.3B-xat/blob/main/results_2024-01-07T11-25-36.448416.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2716155788779588,
"acc_stderr": 0.031058724787907607,
"acc_norm": 0.27377952221352536,
"acc_norm_stderr": 0.03188339765325689,
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059683,
"mc2": 0.3796327664824807,
"mc2_stderr": 0.013938529246022055
},
"harness|arc:challenge|25": {
"acc": 0.3455631399317406,
"acc_stderr": 0.013896938461145692,
"acc_norm": 0.386518771331058,
"acc_norm_stderr": 0.014230084761910474
},
"harness|hellaswag|10": {
"acc": 0.4698267277434774,
"acc_stderr": 0.0049806874674861,
"acc_norm": 0.6376219876518622,
"acc_norm_stderr": 0.004797048154893968
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.036117805602848975,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.036117805602848975
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.30566037735849055,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.30566037735849055,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641143,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641143
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0220190800122179,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0220190800122179
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102147,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102147
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3,
"acc_stderr": 0.02606936229533514,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02606936229533514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2878787878787879,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.2878787878787879,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361283,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361283
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696545,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696545
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.16591928251121077,
"acc_stderr": 0.024967553196547133,
"acc_norm": 0.16591928251121077,
"acc_norm_stderr": 0.024967553196547133
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.4174757281553398,
"acc_stderr": 0.04882840548212237,
"acc_norm": 0.4174757281553398,
"acc_norm_stderr": 0.04882840548212237
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21367521367521367,
"acc_stderr": 0.02685345037700915,
"acc_norm": 0.21367521367521367,
"acc_norm_stderr": 0.02685345037700915
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2541507024265645,
"acc_stderr": 0.015569254692045793,
"acc_norm": 0.2541507024265645,
"acc_norm_stderr": 0.015569254692045793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262206,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262206
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.02470614107070547,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.02470614107070547
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24902216427640156,
"acc_stderr": 0.01104489226404077,
"acc_norm": 0.24902216427640156,
"acc_norm_stderr": 0.01104489226404077
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.016819028375736386,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.016819028375736386
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3306122448979592,
"acc_stderr": 0.030116426296540613,
"acc_norm": 0.3306122448979592,
"acc_norm_stderr": 0.030116426296540613
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233135,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233135
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059683,
"mc2": 0.3796327664824807,
"mc2_stderr": 0.013938529246022055
},
"harness|winogrande|5": {
"acc": 0.6243093922651933,
"acc_stderr": 0.013611257508380438
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
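For reference, a minimal sketch of how one might recompute the MMLU average from a local copy of the results file above (the file name mirrors the linked JSON; real result files may nest this mapping under a top-level "results" key, so adjust accordingly):

```python
import json

# Hypothetical local copy of the JSON shown above.
with open("results_2024-01-07T11-25-36.448416.json") as f:
    results = json.load(f)

# Mean normalized accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc_norm = {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```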
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_xaviviro__FLOR-6.3B-xat | [
"region:us"
] | 2024-01-07T11:28:10+00:00 | {"pretty_name": "Evaluation run of xaviviro/FLOR-6.3B-xat", "dataset_summary": "Dataset automatically created during the evaluation run of model [xaviviro/FLOR-6.3B-xat](https://huggingface.co/xaviviro/FLOR-6.3B-xat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xaviviro__FLOR-6.3B-xat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T11:25:36.448416](https://huggingface.co/datasets/open-llm-leaderboard/details_xaviviro__FLOR-6.3B-xat/blob/main/results_2024-01-07T11-25-36.448416.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2716155788779588,\n \"acc_stderr\": 0.031058724787907607,\n \"acc_norm\": 0.27377952221352536,\n \"acc_norm_stderr\": 0.03188339765325689,\n \"mc1\": 0.21297429620563035,\n \"mc1_stderr\": 0.014332203787059683,\n \"mc2\": 0.3796327664824807,\n \"mc2_stderr\": 0.013938529246022055\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3455631399317406,\n \"acc_stderr\": 0.013896938461145692,\n \"acc_norm\": 0.386518771331058,\n \"acc_norm_stderr\": 0.014230084761910474\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4698267277434774,\n \"acc_stderr\": 0.0049806874674861,\n \"acc_norm\": 0.6376219876518622,\n \"acc_norm_stderr\": 0.004797048154893968\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.036117805602848975,\n \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.036117805602848975\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.30566037735849055,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.30566037735849055,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n 
\"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.02964400657700962,\n \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.02964400657700962\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.0220190800122179,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0220190800122179\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n \"acc_stderr\": 0.03455071019102147,\n \"acc_norm\": 0.18253968253968253,\n \"acc_norm_stderr\": 0.03455071019102147\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.02606936229533514,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.02606936229533514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2878787878787879,\n \"acc_stderr\": 0.03225883512300992,\n \"acc_norm\": 0.2878787878787879,\n \"acc_norm_stderr\": 0.03225883512300992\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361283,\n \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361283\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.16591928251121077,\n \"acc_stderr\": 0.024967553196547133,\n \"acc_norm\": 0.16591928251121077,\n \"acc_norm_stderr\": 0.024967553196547133\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.4174757281553398,\n \"acc_stderr\": 0.04882840548212237,\n \"acc_norm\": 0.4174757281553398,\n \"acc_norm_stderr\": 0.04882840548212237\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21367521367521367,\n \"acc_stderr\": 0.02685345037700915,\n \"acc_norm\": 0.21367521367521367,\n \"acc_norm_stderr\": 0.02685345037700915\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2541507024265645,\n \"acc_stderr\": 
0.015569254692045793,\n \"acc_norm\": 0.2541507024265645,\n \"acc_norm_stderr\": 0.015569254692045793\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.2765273311897106,\n \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262206,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262206\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2198581560283688,\n \"acc_stderr\": 0.02470614107070547,\n \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.02470614107070547\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n \"acc_stderr\": 0.01104489226404077,\n \"acc_norm\": 0.24902216427640156,\n \"acc_norm_stderr\": 0.01104489226404077\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.016819028375736386,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.016819028375736386\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3306122448979592,\n \"acc_stderr\": 0.030116426296540613,\n \"acc_norm\": 0.3306122448979592,\n \"acc_norm_stderr\": 0.030116426296540613\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n \"acc_stderr\": 0.03384429155233135,\n \"acc_norm\": 0.25301204819277107,\n \"acc_norm_stderr\": 0.03384429155233135\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686399,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686399\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21297429620563035,\n \"mc1_stderr\": 0.014332203787059683,\n \"mc2\": 0.3796327664824807,\n \"mc2_stderr\": 0.013938529246022055\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6243093922651933,\n \"acc_stderr\": 0.013611257508380438\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/xaviviro/FLOR-6.3B-xat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|arc:challenge|25_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|gsm8k|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hellaswag|10_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T11-25-36.448416.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T11-25-36.448416.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T11-25-36.448416.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T11-25-36.448416.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T11-25-36.448416.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T11_25_36.448416", "path": ["**/details_harness|winogrande|5_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-07T11-25-36.448416.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_07T11_25_36.448416", "path": ["results_2024-01-07T11-25-36.448416.parquet"]}, {"split": "latest", "path": ["results_2024-01-07T11-25-36.448416.parquet"]}]}]} | 2024-01-07T11:28:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of xaviviro/FLOR-6.3B-xat
Dataset automatically created during the evaluation run of model xaviviro/FLOR-6.3B-xat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
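A minimal sketch; the repo id below is an assumption following the leaderboard's usual `details_<org>__<model>` naming convention, not something stated in this card:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_xaviviro__FLOR-6.3B-xat",
	"harness_winogrande_5",
	split="train")
```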
## Latest results
These are the latest results from run 2024-01-07T11:25:36.448416 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of xaviviro/FLOR-6.3B-xat\n\n\n\nDataset automatically created during the evaluation run of model xaviviro/FLOR-6.3B-xat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T11:25:36.448416(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of xaviviro/FLOR-6.3B-xat\n\n\n\nDataset automatically created during the evaluation run of model xaviviro/FLOR-6.3B-xat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T11:25:36.448416(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xaviviro/FLOR-6.3B-xat\n\n\n\nDataset automatically created during the evaluation run of model xaviviro/FLOR-6.3B-xat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T11:25:36.448416(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
03e715d71230a6dc1b62d40311d572ef72306577 | # Dataset Card for "araproje_arc_tr_conf_mgpt_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_tr_conf_mgpt_nearestscore_true_y | [
"region:us"
] | 2024-01-07T12:04:35+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 86423.0, "num_examples": 250}], "download_size": 50809, "dataset_size": 86423.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-07T12:04:37+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_arc_tr_conf_mgpt_nearestscore_true_y"
More Information needed | [
"# Dataset Card for \"araproje_arc_tr_conf_mgpt_nearestscore_true_y\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_arc_tr_conf_mgpt_nearestscore_true_y\"\n\nMore Information needed"
] | [
6,
34
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_arc_tr_conf_mgpt_nearestscore_true_y\"\n\nMore Information needed"
] |
879be01cb2ce7673422f5be498ba0eb55d46fe85 | # Dataset Card for "araproje_arc_tr_conf_mgpt_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_tr_conf_mgpt_nearestscore_true | [
"region:us"
] | 2024-01-07T12:04:40+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 86423.0, "num_examples": 250}], "download_size": 50777, "dataset_size": 86423.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-07T12:04:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_arc_tr_conf_mgpt_nearestscore_true"
More Information needed | [
"# Dataset Card for \"araproje_arc_tr_conf_mgpt_nearestscore_true\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_arc_tr_conf_mgpt_nearestscore_true\"\n\nMore Information needed"
] | [
6,
32
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_arc_tr_conf_mgpt_nearestscore_true\"\n\nMore Information needed"
] |
68c98187b19491ce9867702786a0e1dd7449f882 | # Dataset Card for "araproje_arc_en_conf_llama_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_en_conf_llama_nearestscore_true_y | [
"region:us"
] | 2024-01-07T12:09:20+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 80031.0, "num_examples": 250}], "download_size": 46853, "dataset_size": 80031.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-07T12:09:22+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_arc_en_conf_llama_nearestscore_true_y"
More Information needed | [
"# Dataset Card for \"araproje_arc_en_conf_llama_nearestscore_true_y\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_arc_en_conf_llama_nearestscore_true_y\"\n\nMore Information needed"
] | [
6,
34
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_arc_en_conf_llama_nearestscore_true_y\"\n\nMore Information needed"
] |
4c518f49f81d8f5cdcd50f8c61630a215e3437b4 | # Dataset Card for "araproje_arc_en_conf_llama_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_en_conf_llama_nearestscore_true | [
"region:us"
] | 2024-01-07T12:09:25+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 80031.0, "num_examples": 250}], "download_size": 46854, "dataset_size": 80031.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-07T12:09:26+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_arc_en_conf_llama_nearestscore_true"
More Information needed | [
"# Dataset Card for \"araproje_arc_en_conf_llama_nearestscore_true\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_arc_en_conf_llama_nearestscore_true\"\n\nMore Information needed"
] | [
6,
32
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_arc_en_conf_llama_nearestscore_true\"\n\nMore Information needed"
] |
01e85721ec6ec4b8615b203466c3c1c8b1549c8b | # Dataset Card for "araproje_arc_en_conf_llama_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_en_conf_llama_nearestscore_true_x | [
"region:us"
] | 2024-01-07T12:11:07+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 80031.0, "num_examples": 250}], "download_size": 46930, "dataset_size": 80031.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-07T12:11:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_arc_en_conf_llama_nearestscore_true_x"
More Information needed | [
"# Dataset Card for \"araproje_arc_en_conf_llama_nearestscore_true_x\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_arc_en_conf_llama_nearestscore_true_x\"\n\nMore Information needed"
] | [
6,
34
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_arc_en_conf_llama_nearestscore_true_x\"\n\nMore Information needed"
] |
db2507c184eeca9575ec196d927be630f2d87505 | # Dataset Card for "araproje_arc_tr_conf_mgpt_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_arc_tr_conf_mgpt_nearestscore_true_x | [
"region:us"
] | 2024-01-07T12:11:41+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "sequence": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 86423.0, "num_examples": 250}], "download_size": 50775, "dataset_size": 86423.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-07T12:11:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_arc_tr_conf_mgpt_nearestscore_true_x"
More Information needed | [
"# Dataset Card for \"araproje_arc_tr_conf_mgpt_nearestscore_true_x\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_arc_tr_conf_mgpt_nearestscore_true_x\"\n\nMore Information needed"
] | [
6,
34
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_arc_tr_conf_mgpt_nearestscore_true_x\"\n\nMore Information needed"
] |
234cc0dbe5f2e293cbb4161783825f0bb6f32a3f | # Dataset Card for "araproje_mmlu_tr_conf_mgpt_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_tr_conf_mgpt_nearestscore_true_y | [
"region:us"
] | 2024-01-07T12:18:22+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 137404.0, "num_examples": 250}], "download_size": 83747, "dataset_size": 137404.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-07T12:18:23+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_mmlu_tr_conf_mgpt_nearestscore_true_y"
More Information needed | [
"# Dataset Card for \"araproje_mmlu_tr_conf_mgpt_nearestscore_true_y\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_mmlu_tr_conf_mgpt_nearestscore_true_y\"\n\nMore Information needed"
] | [
6,
34
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_mmlu_tr_conf_mgpt_nearestscore_true_y\"\n\nMore Information needed"
] |
30628ced57d5b3045cc5a0dd63b43a6b4c794011 | # Dataset Card for "araproje_mmlu_tr_conf_mgpt_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_tr_conf_mgpt_nearestscore_true_x | [
"region:us"
] | 2024-01-07T12:18:25+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 137404.0, "num_examples": 250}], "download_size": 83864, "dataset_size": 137404.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-07T12:18:27+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_mmlu_tr_conf_mgpt_nearestscore_true_x"
More Information needed | [
"# Dataset Card for \"araproje_mmlu_tr_conf_mgpt_nearestscore_true_x\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_mmlu_tr_conf_mgpt_nearestscore_true_x\"\n\nMore Information needed"
] | [
6,
34
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_mmlu_tr_conf_mgpt_nearestscore_true_x\"\n\nMore Information needed"
] |
ae9ccdf0ae913fab505dc4d14b655b5529b8d794 | # Dataset Card for "araproje_mmlu_tr_conf_mgpt_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_tr_conf_mgpt_nearestscore_true | [
"region:us"
] | 2024-01-07T12:18:28+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 137404.0, "num_examples": 250}], "download_size": 84068, "dataset_size": 137404.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-07T12:18:30+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_mmlu_tr_conf_mgpt_nearestscore_true"
More Information needed | [
"# Dataset Card for \"araproje_mmlu_tr_conf_mgpt_nearestscore_true\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_mmlu_tr_conf_mgpt_nearestscore_true\"\n\nMore Information needed"
] | [
6,
32
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_mmlu_tr_conf_mgpt_nearestscore_true\"\n\nMore Information needed"
] |
0d2ca38391ee7d3c651a2a475d63d0a54dedfef8 | # Dataset Card for "araproje_mmlu_en_conf_llama_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_en_conf_llama_nearestscore_true_y | [
"region:us"
] | 2024-01-07T12:20:02+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 130579.0, "num_examples": 250}], "download_size": 79306, "dataset_size": 130579.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:01:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_mmlu_en_conf_llama_nearestscore_true_y"
More Information needed | [
"# Dataset Card for \"araproje_mmlu_en_conf_llama_nearestscore_true_y\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_mmlu_en_conf_llama_nearestscore_true_y\"\n\nMore Information needed"
] | [
6,
34
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_mmlu_en_conf_llama_nearestscore_true_y\"\n\nMore Information needed"
] |
9195e1ae9adc9dc2b98e4cc2a2c585f36598b839 | # Dataset Card for "araproje_mmlu_en_conf_llama_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_en_conf_llama_nearestscore_true_x | [
"region:us"
] | 2024-01-07T12:20:05+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 130579.0, "num_examples": 250}], "download_size": 0, "dataset_size": 130579.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:01:26+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_mmlu_en_conf_llama_nearestscore_true_x"
More Information needed | [
"# Dataset Card for \"araproje_mmlu_en_conf_llama_nearestscore_true_x\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_mmlu_en_conf_llama_nearestscore_true_x\"\n\nMore Information needed"
] | [
6,
34
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_mmlu_en_conf_llama_nearestscore_true_x\"\n\nMore Information needed"
] |
8779dd3a09450ef34fd719de8e73372709ed884c | # Dataset Card for "araproje_mmlu_en_conf_llama_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_en_conf_llama_nearestscore_true | [
"region:us"
] | 2024-01-07T12:20:09+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 130579.0, "num_examples": 250}], "download_size": 0, "dataset_size": 130579.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T10:01:28+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_mmlu_en_conf_llama_nearestscore_true"
More Information needed | [
"# Dataset Card for \"araproje_mmlu_en_conf_llama_nearestscore_true\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_mmlu_en_conf_llama_nearestscore_true\"\n\nMore Information needed"
] | [
6,
32
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_mmlu_en_conf_llama_nearestscore_true\"\n\nMore Information needed"
] |
f63e57a2943aa115970ca40e2237fb49885b4639 |
This dataset is derived from [`mteb/sts17-crosslingual-sts`](https://huggingface.co/datasets/mteb/sts17-crosslingual-sts).
We translated the `en-en` pairs to `zh-zh` and `id-id` using ChatGPT.
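As a minimal usage sketch (the config name `zh-zh`, the `test` split, and the column layout are assumptions carried over from `mteb/sts17-crosslingual-sts`, not confirmed by this card):

```python
from datasets import load_dataset

# Config, split and column names are assumed to mirror mteb/sts17-crosslingual-sts
ds = load_dataset("izhx/sts17-crosslingual-extend", "zh-zh", split="test")
print(ds[0])  # e.g. sentence1, sentence2 and a similarity score
```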
| izhx/sts17-crosslingual-extend | [
"size_categories:n<1K",
"region:us"
] | 2024-01-07T12:32:47+00:00 | {"size_categories": ["n<1K"]} | 2024-01-07T13:14:40+00:00 | [] | [] | TAGS
#size_categories-n<1K #region-us
|
This dataset is derived from 'mteb/sts17-crosslingual-sts'.
We translated the 'en-en' pairs to 'zh-zh' and 'id-id' using ChatGPT.
| [] | [
"TAGS\n#size_categories-n<1K #region-us \n"
] | [
16
] | [
"passage: TAGS\n#size_categories-n<1K #region-us \n"
] |
392658e44f099e3541d623c2a2559d4dfe300395 |
# 2000 Chinese RoleCards from IMDB_250 Movies and PIPPA
Role cards for extending zero-shot role-playing.
870 of the characters come from movie-subtitle summaries (ids of the form movie_xx); 406 of those cards were translated into Simplified Chinese, and the rest were left untranslated (so some Traditional Chinese and English is mixed in).
1270 characters come from a translation of the PIPPA dataset.
- [凌云志](https://github.com/Kirovsiki) @ Bournemouth University crawled the movie subtitles with the Shooter (射手) subtitle API
- 李鲁鲁 did the subtitle-to-role-card summarization and the translation of the data (openai)
# Next steps
We next plan to use these cards to collect data from the openai, CharacterGLM, and KoboldAI APIs, following the Baize approach.
Project homepage: https://github.com/LC1332/Chat-Haruhi-Suzumiya
If you want to discuss joining our project,
you can send your contact information via private message to https://www.zhihu.com/people/cheng-li-47 | silk-road/Haruhi-Zero-RolePlaying-movie-PIPPA | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:zh",
"license:apache-2.0",
"region:us"
] | 2024-01-07T12:44:45+00:00 | {"language": ["zh"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]} | 2024-01-07T12:54:56+00:00 | [] | [
"zh"
] | TAGS
#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-apache-2.0 #region-us
|
# 2000 Chinese RoleCards from IMDB_250 Movies and PIPPA
Role cards for extending zero-shot role-playing.
870 of the characters come from movie-subtitle summaries (ids of the form movie_xx); 406 of those cards were translated into Simplified Chinese, and the rest were left untranslated (so some Traditional Chinese and English is mixed in).
1270 characters come from a translation of the PIPPA dataset.
- 凌云志 @ Bournemouth University crawled the movie subtitles with the Shooter (射手) subtitle API
- 李鲁鲁 did the subtitle-to-role-card summarization and the translation of the data (openai)
# Next steps
We next plan to use these cards to collect data from the openai, CharacterGLM, and KoboldAI APIs, following the Baize approach.
Project homepage: URL
If you want to discuss joining our project,
you can send your contact information via private message to URL | [
"# 2000 Chinese RoleCards from IMDB_250 Movies and PIPPA\n\n用于拓展zero-shot角色扮演的角色卡片。\n\n其中870个角色来自电影字幕总结(id为movie_xx),其中406张翻译成了简体中文,剩下的没翻(所以有些繁体或者英文混杂)\n\n1270个角色来自于对PIPPA数据集的翻译\n\n- 凌云志@伯恩茅斯大学 使用射手api爬取了电影的字幕\n\n- 李鲁鲁 完成了从字幕到角色卡片的总结,以及对数据的翻译(openai)",
"# 后续\n\n我们后续打算用这些卡片 从openai, CharacterGLM, KoboldAI的api中,利用Baize的方式去获得数据。\n\n项目主页 URL\n\n\n如果你要讨论加入我们的项目\n\n可以把你的联系方式私信发给 URL"
] | [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-apache-2.0 #region-us \n",
"# 2000 Chinese RoleCards from IMDB_250 Movies and PIPPA\n\n用于拓展zero-shot角色扮演的角色卡片。\n\n其中870个角色来自电影字幕总结(id为movie_xx),其中406张翻译成了简体中文,剩下的没翻(所以有些繁体或者英文混杂)\n\n1270个角色来自于对PIPPA数据集的翻译\n\n- 凌云志@伯恩茅斯大学 使用射手api爬取了电影的字幕\n\n- 李鲁鲁 完成了从字幕到角色卡片的总结,以及对数据的翻译(openai)",
"# 后续\n\n我们后续打算用这些卡片 从openai, CharacterGLM, KoboldAI的api中,利用Baize的方式去获得数据。\n\n项目主页 URL\n\n\n如果你要讨论加入我们的项目\n\n可以把你的联系方式私信发给 URL"
] | [
42,
124,
57
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-apache-2.0 #region-us \n# 2000 Chinese RoleCards from IMDB_250 Movies and PIPPA\n\n用于拓展zero-shot角色扮演的角色卡片。\n\n其中870个角色来自电影字幕总结(id为movie_xx),其中406张翻译成了简体中文,剩下的没翻(所以有些繁体或者英文混杂)\n\n1270个角色来自于对PIPPA数据集的翻译\n\n- 凌云志@伯恩茅斯大学 使用射手api爬取了电影的字幕\n\n- 李鲁鲁 完成了从字幕到角色卡片的总结,以及对数据的翻译(openai)# 后续\n\n我们后续打算用这些卡片 从openai, CharacterGLM, KoboldAI的api中,利用Baize的方式去获得数据。\n\n项目主页 URL\n\n\n如果你要讨论加入我们的项目\n\n可以把你的联系方式私信发给 URL"
] |
4234468ced36351bff3dda3075bcd2424afb0456 | ---
license: unknown
---

from huggingface_hub import hf_hub_url

# Resolve the direct download URL for the zip archive hosted in this repo
url = hf_hub_url(
    repo_id="Maomineang/HisopingV1", filename="hisoping (1).zip"
)
| Maomineang/HisopingV1 | [
"region:us"
] | 2024-01-07T12:45:56+00:00 | {} | 2024-01-07T12:48:28+00:00 | [] | [] | TAGS
#region-us
| ---
license: unknown
---from huggingface_hub import hf_hub_url
hf_hub_url(
repo_id="Maomineang/HisopingV1", filename="hisoping (1).zip"
)
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
6dcf185d49848d2a513e9fab11ed9242fcad904e |
This dataset is derived from [stsb_multi_mt](https://huggingface.co/datasets/stsb_multi_mt).
We translated the `en` test set to `ar` with Google Translate and to `id` with DeepL.
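A minimal loading sketch (the per-language config names and the `test` split are assumed to mirror stsb_multi_mt; this card does not confirm them):

```python
from datasets import load_dataset

# Config names "ar" / "id" and the "test" split are assumptions here
ar_test = load_dataset("izhx/stsb_multi_mt_extend", "ar", split="test")
print(ar_test[0])  # expected fields: sentence1, sentence2, similarity_score
```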
| izhx/stsb_multi_mt_extend | [
"multilinguality:multilingual",
"language:de",
"language:en",
"language:es",
"language:fr",
"language:it",
"language:nl",
"language:pl",
"language:pt",
"language:ru",
"language:ar",
"language:id",
"license:cc-by-sa-4.0",
"region:us"
] | 2024-01-07T13:22:53+00:00 | {"language": ["de", "en", "es", "fr", "it", "nl", "pl", "pt", "ru", "ar", "id"], "license": "cc-by-sa-4.0", "multilinguality": ["multilingual"]} | 2024-01-07T14:26:22+00:00 | [] | [
"de",
"en",
"es",
"fr",
"it",
"nl",
"pl",
"pt",
"ru",
"ar",
"id"
] | TAGS
#multilinguality-multilingual #language-German #language-English #language-Spanish #language-French #language-Italian #language-Dutch #language-Polish #language-Portuguese #language-Russian #language-Arabic #language-Indonesian #license-cc-by-sa-4.0 #region-us
|
This dataset is derived from stsb_multi_mt.
We translated the 'en' test set to 'ar' with Google Translate and to 'id' with DeepL.
| [] | [
"TAGS\n#multilinguality-multilingual #language-German #language-English #language-Spanish #language-French #language-Italian #language-Dutch #language-Polish #language-Portuguese #language-Russian #language-Arabic #language-Indonesian #license-cc-by-sa-4.0 #region-us \n"
] | [
81
] | [
"passage: TAGS\n#multilinguality-multilingual #language-German #language-English #language-Spanish #language-French #language-Italian #language-Dutch #language-Polish #language-Portuguese #language-Russian #language-Arabic #language-Indonesian #license-cc-by-sa-4.0 #region-us \n"
] |
613c915cab9709b9fdb67bba38fca97df10777b7 |
## Description
<b>rawrr_v1</b> is a highly experimental pairs-style dataset that was created to help with de-contamination of so-called "base" models. \
Field `chosen` contains outputs from base models that weren't instruct tuned and were released directly after pre-training, in a raw format. Some of those outputs are just completions of a prompt, while some are answers to the prompt. \
Field `rejected` contains outputs from models that were contaminated before public release.
To my knowledge, this dataset doesn't contain any toxic, hateful content. \
To my knowledge, this dataset doesn't contain any content that could be deemed illegal in totalitarian countries, but I don't know every piece of law, so it's best if you still exercise proper caution when dealing with a malicious regime. \
To my knowledge, all of the prompts in the no_robots dataset are pretty benign.
A mix of publicly available models was used for the creation of this dataset.
More and more base models nowadays aren't released straight after pre-training the model. Instead, model authors sneak in additional instruct fine-tuning and release only that fine-tuned model, calling it a base model. \
My aim is to try to reverse that process so that researchers and the community can possess models resembling the raw model that are primarily aimed at completion instead of instruction following. \
Of course, my attempts are not very sophisticated since I am using just my private PC for the dataset generation, so I can't create complex multi-GB synthetic datasets in reasonable time-frames, but I think running DPO with this dataset could still help with this issue.
This dataset is based on HuggingFaceH4/no_robots and winglian/no_robots_rlhf \
Fields `prompt`, `source` and `id` have been kept from the base datasets; the `chosen` and `rejected` fields have been replaced with synthetic output.
Field `system` has been overwritten with "A chat."
The original dataset was released under the cc-by-nc-4.0 license, so I am keeping it this way.
I used the following generation parameters:
```json
{
"max_tokens": 600,
"temperature": 0.8,
"temperature_last": "False",
"top_k": 40,
"top_p": 0.8,
"top_a": 0.0,
"n":1,
"min_p": 0,
"repetition_penalty": 1.12,
"repetition_range": 400
}
```
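As a minimal sketch of how such a pairs dataset is typically consumed for DPO (the `train` split name is an assumption, not stated in this card):

```python
from datasets import load_dataset

# Split name "train" is assumed; the card does not state it
pairs = load_dataset("adamo1139/rawrr_v1", split="train")

# Each row pairs a raw-base-model-style response ("chosen") with a
# contaminated-model-style response ("rejected") for the same prompt.
example = pairs[0]
print(example["system"])          # "A chat."
print(example["prompt"])
print(example["chosen"][:200])    # preferred (raw-style) completion
print(example["rejected"][:200])  # dispreferred (contaminated-style) completion
```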
## Issues in v1
To keep the generation time reasonable, I set max_tokens in the output to 600. Because of this, some generations in the `chosen` field are cut off mid-sentence. I will see whether it's an issue after doing DPO, and maybe try to raise the max_tokens limit for my next attempt or remove those broken replies from this version. \
Also, many responses in the `rejected` field start with "I ". Will this be an issue later down the road, and will the model be unable to respond with a first-person view after fine-tuning? Maybe, I don't know. \
no_robots is a dataset with a relatively non-permissive cc-by-nc-4.0 license. If you know any ethically sourced, permissively licensed human-made dataset that I could use for the next version - let me know! \
I thought about using the OpenAssistant dataset for this, but its file structure is a mess I didn't want to dive into. | adamo1139/rawrr_v1 | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-01-07T13:44:56+00:00 | {"license": "cc-by-nc-4.0"} | 2024-01-13T22:59:15+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
|
## Description
<b>rawrr_v1</b> is a highly experimental pairs-style dataset that was created to help with de-contamination of so-called "base" models. \
Field 'chosen' contains outputs from base models that weren't instruct tuned and were released directly after pre-training, in a raw format. Some of those outputs are just completions of a prompt, while some are answers to the prompt. \
Field 'rejected' contains outputs from models that were contaminated before public release.
To my knowledge, this dataset doesn't contain any toxic, hateful content. \
To my knowledge, this dataset doesn't contain any content that could be deemed illegal in totalitarian countries, but I don't know every piece of law, so it's best if you still exercise proper caution when dealing with a malicious regime. \
To my knowledge, all of the prompts in the no_robots dataset are pretty benign.
A mix of publicly available models was used for the creation of this dataset.
More and more base models nowadays aren't released straight after pre-training the model. Instead, model authors sneak in additional instruct fine-tuning and release only that fine-tuned model, calling it a base model. \
My aim is to try to reverse that process so that researchers and the community can possess models resembling the raw model that are primarily aimed at completion instead of instruction following. \
Of course, my attempts are not very sophisticated since I am using just my private PC for the dataset generation, so I can't create complex multi-GB synthetic datasets in reasonable time-frames, but I think running DPO with this dataset could still help with this issue.
This dataset is based on HuggingFaceH4/no_robots and winglian/no_robots_rlhf \
Fields 'prompt', 'source' and 'id' have been kept from the base datasets; the 'chosen' and 'rejected' fields have been replaced with synthetic output.
Field 'system' has been overwritten with "A chat."
The original dataset was released under the cc-by-nc-4.0 license, so I am keeping it this way.
I used the following generation parameters.
## Issues in v1
To keep the generation time reasonable, I set max_tokens in the output to 600. Because of this, some generations in the 'chosen' field are cut off mid-sentence. I will see whether it's an issue after doing DPO, and maybe try to raise the max_tokens limit for my next attempt or remove those broken replies from this version. \
Also, many responses in the 'rejected' field start with "I ". Will this be an issue later down the road, and will the model be unable to respond with a first-person view after fine-tuning? Maybe, I don't know. \
no_robots is a dataset with a relatively non-permissive cc-by-nc-4.0 license. If you know any ethically sourced, permissively licensed human-made dataset that I could use for the next version - let me know! \
I thought about using the OpenAssistant dataset for this, but its file structure is a mess I didn't want to dive into. | [
"## Description\n\n<b>rawrr_v1</b> is highly-experimental pairs style dataset that was created to help with de-contamination of so-called \"base\" models. \\\nField 'chosen' contains outputs from base models that weren't instruct tuned and were released directly after pre-training, in a raw format. Some of those outputs are just completions of a prompt, while some are answers to the prompt. \\\nField 'rejected' contains outputs from models that were contaminated before public release. \n\nTo my knowledge, this dataset doesn't contain any toxic, hateful content. \\\nTo my knowledge, this dataset doesn't contain any content that could be deemed illegal in totalitarian countries, but I don't know every piece of law, so it's best if you still exercise proper caution when dealing with malicious regime. \\\nTo my knowledge, all of the prompts in no_robots dataset are pretty benign.\n\nMix of publicly available models was used for creation of this dataset.\n\nMore and more base models nowadays aren't released straight after pre-training the model. Instead, model authors sneak in additional instruct fine-tuning and release only that fine-tuned model, calling it a base model. \\\nMy aim is to try to reverse that process so that researchers and community can posses models resembling the raw model that are primarily aimed at completion instead of instruct. \\\nOf course, my attempts are not very sophisticated since I am using just my private PC for the dataset generation, so I can't create complex multi-GB synthetic datasets in reasonable time-frames, but I think running DPO with this dataset could still help with this issue.\n\nThis dataset is based on HuggingFaceH4/no_robots and winglian/no_robots_rlhf \\\nFields 'prompt', 'source' and 'id' have been kept from base datasets, 'chosen' and 'rejected' fields have been replaced used synthetic output.\nField 'system' has been overwritten with \"A chat.\"\nOriginal dataset released with cc-by-nc-4.0 dataset, so I am keeping it this way.\n\nI used following generation parameters\n\n\n\n\n\n\n ## Issues in v1\n\n To keep the generation time reasonable, I set max_tokens in output to 600. Because of this, some generations in field 'chosen' are cut off mid-sentence. I will see whether it's an issue after doing DPO and maybe try to make the max_tokens limit longer for my next attempt or remove those broken replies from this version. \\\n Also, many responses in 'rejected' field start from \"I \". Will this be an issue later down the road and will model be unable to respond with first-person view after fine-tuning? Maybe, I don't know. \\\n no_robots is a dataset with relatively non-permissive cc-by-nc-4.0 license. If you know any ethically sourced permissive human-made dataset that I could use for next version - let me know! \\\n I thinked about using OpenAssistant dataset for this, but it's file structure is a mess I didn't want to dive into."
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"## Description\n\n<b>rawrr_v1</b> is highly-experimental pairs style dataset that was created to help with de-contamination of so-called \"base\" models. \\\nField 'chosen' contains outputs from base models that weren't instruct tuned and were released directly after pre-training, in a raw format. Some of those outputs are just completions of a prompt, while some are answers to the prompt. \\\nField 'rejected' contains outputs from models that were contaminated before public release. \n\nTo my knowledge, this dataset doesn't contain any toxic, hateful content. \\\nTo my knowledge, this dataset doesn't contain any content that could be deemed illegal in totalitarian countries, but I don't know every piece of law, so it's best if you still exercise proper caution when dealing with malicious regime. \\\nTo my knowledge, all of the prompts in no_robots dataset are pretty benign.\n\nMix of publicly available models was used for creation of this dataset.\n\nMore and more base models nowadays aren't released straight after pre-training the model. Instead, model authors sneak in additional instruct fine-tuning and release only that fine-tuned model, calling it a base model. \\\nMy aim is to try to reverse that process so that researchers and community can posses models resembling the raw model that are primarily aimed at completion instead of instruct. \\\nOf course, my attempts are not very sophisticated since I am using just my private PC for the dataset generation, so I can't create complex multi-GB synthetic datasets in reasonable time-frames, but I think running DPO with this dataset could still help with this issue.\n\nThis dataset is based on HuggingFaceH4/no_robots and winglian/no_robots_rlhf \\\nFields 'prompt', 'source' and 'id' have been kept from base datasets, 'chosen' and 'rejected' fields have been replaced used synthetic output.\nField 'system' has been overwritten with \"A chat.\"\nOriginal dataset released with cc-by-nc-4.0 dataset, so I am keeping it this way.\n\nI used following generation parameters\n\n\n\n\n\n\n ## Issues in v1\n\n To keep the generation time reasonable, I set max_tokens in output to 600. Because of this, some generations in field 'chosen' are cut off mid-sentence. I will see whether it's an issue after doing DPO and maybe try to make the max_tokens limit longer for my next attempt or remove those broken replies from this version. \\\n Also, many responses in 'rejected' field start from \"I \". Will this be an issue later down the road and will model be unable to respond with first-person view after fine-tuning? Maybe, I don't know. \\\n no_robots is a dataset with relatively non-permissive cc-by-nc-4.0 license. If you know any ethically sourced permissive human-made dataset that I could use for next version - let me know! \\\n I thinked about using OpenAssistant dataset for this, but it's file structure is a mess I didn't want to dive into."
] | [
17,
750
] | [
"passage: TAGS\n#license-cc-by-nc-4.0 #region-us \n"
] |
36f7916511a28aea6d24a31f2b722bd7c0c83a08 |
# Dataset Card for Evaluation run of AA051610/A0106
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/A0106](https://huggingface.co/AA051610/A0106) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__A0106",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-07T23:07:28.080056](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A0106/blob/main/results_2024-01-07T23-07-28.080056.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7368670822409534,
"acc_stderr": 0.029070974182818815,
"acc_norm": 0.7408880337597785,
"acc_norm_stderr": 0.029626716433824633,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.01711581563241819,
"mc2": 0.5782373220428066,
"mc2_stderr": 0.015253933341089682
},
"harness|arc:challenge|25": {
"acc": 0.6416382252559727,
"acc_stderr": 0.014012883334859859,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.013796182947785562
},
"harness|hellaswag|10": {
"acc": 0.6574387572196774,
"acc_stderr": 0.004735962781136062,
"acc_norm": 0.8505277833100976,
"acc_norm_stderr": 0.003558246300379053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02967416752010147,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02967416752010147
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.024079995130062253,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.024079995130062253
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7702127659574468,
"acc_stderr": 0.02750175294441242,
"acc_norm": 0.7702127659574468,
"acc_norm_stderr": 0.02750175294441242
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7103448275862069,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.7103448275862069,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.02487081525105709,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.02487081525105709
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8709677419354839,
"acc_stderr": 0.019070889254792747,
"acc_norm": 0.8709677419354839,
"acc_norm_stderr": 0.019070889254792747
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5812807881773399,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.5812807881773399,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322605,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047912,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047912
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476458,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476458
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.019880165406588796,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.019880165406588796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611759,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611759
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316942,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316942
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065515,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065515
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.0309227883204458,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.0309227883204458
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869622,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869622
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.0283116014414386,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.0283116014414386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881348,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311357,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311357
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9042145593869731,
"acc_stderr": 0.010524031079055822,
"acc_norm": 0.9042145593869731,
"acc_norm_stderr": 0.010524031079055822
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252555,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6134078212290502,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.6134078212290502,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.022122439772480774,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.022122439772480774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.020423955354778034,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.020423955354778034
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5851063829787234,
"acc_stderr": 0.029392236584612496,
"acc_norm": 0.5851063829787234,
"acc_norm_stderr": 0.029392236584612496
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5528031290743155,
"acc_stderr": 0.012698825252435113,
"acc_norm": 0.5528031290743155,
"acc_norm_stderr": 0.012698825252435113
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.022571771025494767,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.022571771025494767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7924836601307189,
"acc_stderr": 0.016405924270103237,
"acc_norm": 0.7924836601307189,
"acc_norm_stderr": 0.016405924270103237
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534094,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.01711581563241819,
"mc2": 0.5782373220428066,
"mc2_stderr": 0.015253933341089682
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.010626964529971864
},
"harness|gsm8k|5": {
"acc": 0.6254738438210766,
"acc_stderr": 0.013331774158491384
}
}
```
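
The aggregated metrics shown above can also be read back programmatically. The snippet below is a minimal sketch: it assumes only that the `results` configuration loads like any other configuration in this repository, and it avoids hard-coding a split name by listing whatever splits are defined.

```python
from datasets import load_dataset

# Minimal sketch: load all splits of the aggregated "results" configuration
# (one timestamped split per evaluation run) and inspect what is available.
results = load_dataset("open-llm-leaderboard/details_AA051610__A0106", "results")

for split_name, split in results.items():
    # Column names are read from the dataset itself rather than assumed.
    print(split_name, "->", split.num_rows, "row(s); columns:", list(split.features)[:5])
```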
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
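
One concrete direct use this dataset already supports is comparing the two evaluation runs it was created from. The sketch below is illustrative only: the timestamped split names are taken from this repository's configuration listing, and `harness_gsm8k_5` is used purely as an example task.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_AA051610__A0106"
CONFIG = "harness_gsm8k_5"  # example task configuration

# Illustrative sketch: each evaluation run is stored as a timestamped split,
# so run-to-run differences can be inspected side by side.
run_a = load_dataset(REPO, CONFIG, split="2024_01_07T13_47_03.450594")
run_b = load_dataset(REPO, CONFIG, split="2024_01_07T23_07_28.080056")

print(len(run_a), "examples in the first run")
print(len(run_b), "examples in the second run")
```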
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
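
Until this section is filled in, the structure can be inspected directly with the `datasets` library. The sketch below is a minimal example; the configuration name is illustrative, and per-example field names vary by task, so they should be read from `features` rather than assumed.

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_AA051610__A0106"

# Enumerate the per-task configurations exposed by this repository.
configs = get_dataset_config_names(REPO)
print(len(configs), "configurations; first few:", configs[:3])

# Inspect one configuration's per-example fields; the configuration listing
# defines a "latest" split for each task alongside the timestamped run splits.
details = load_dataset(REPO, "harness_gsm8k_5", split="latest")
print(details.features)
```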
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, inter-annotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
["results_2024-01-07T23-07-28.080056.parquet"]}, {"split": "latest", "path": ["results_2024-01-07T23-07-28.080056.parquet"]}]}]} | 2024-01-07T23:10:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051610/A0106
Dataset automatically created during the evaluation run of model AA051610/A0106 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
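For instance, a minimal sketch with the `datasets` library (the repository id below follows the leaderboard's usual `details_<org>__<model>` naming and is an assumption; the config and split names are taken from this repo's metadata):
```python
from datasets import load_dataset

# "latest" always points to the newest evaluation run of a given task config.
data = load_dataset(
    "open-llm-leaderboard/details_AA051610__A0106",  # assumed repository id
    "harness_winogrande_5",                          # one of the 63 task configs
    split="latest",
)
```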
## Latest results
These are the latest results from run 2024-01-07T23:07:28.080056 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AA051610/A0106\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A0106 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T23:07:28.080056(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051610/A0106\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A0106 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T23:07:28.080056(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AA051610/A0106\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A0106 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T23:07:28.080056(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
2de5a16770cc763c368c681fd4d3e3ff95fd8d97 | # No Robots Dataset Card 🙅♂️🤖
### Summary
No Robots is a dataset of 10,000 instructions and demonstrations created by professional annotators. The translation was produced with the Google Cloud Platform Translation API. This dataset can be used to teach LLMs instruction following. (Instruction Supervised Fine-tuning - SFT)
The No Robots dataset was modeled after OpenAI's [InstructGPT paper](https://huggingface.co/papers/2203.02155) and has the following categories:
| Category | Count |
|:-----------|--------:|
| Generation | 4560 |
| Open QA | 1240 |
| Brainstorm | 1120 |
| Chat | 850 |
| Rewrite | 660 |
| Summarize | 420 |
| Coding | 350 |
| Classify | 350 |
| Closed QA | 260 |
| Extract | 190 |
### Languages
This dataset contains Turkish only.
## Dataset Structure
I uploaded this dataset as CSV. If you want to see what the examples look like, check the widget.
### Data Fields
The columns are as follows (a hypothetical example record is sketched after this list):
* `prompt`: Specifies the instruction the model should follow.
* `prompt_id`: Unique identifier.
* `messages`: A list of dictionaries; each dictionary describes a message (key: content) and who sent that message (key: role).
* `category`: The category of the task; I did not translate this field.
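A hypothetical example record illustrating these fields (all values below are invented for illustration):
```python
example = {
    "prompt": "Kahve demlemenin adımlarını kısaca listele.",
    "prompt_id": "0a1b2c3d",  # hypothetical identifier
    "messages": [
        {"content": "Kahve demlemenin adımlarını kısaca listele.", "role": "user"},
        {"content": "1. Suyu kaynatın. 2. Kahveyi demliğe koyun. ...", "role": "assistant"},
    ],
    "category": "Brainstorm",  # category names are kept in English
}
```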
### Splits
| | train_sft | test_sft |
|---------------|------:| ---: |
| no_robots | 9500 | 500 |
### License
Unfortunately, this dataset is not open source but open access. Its license is [Creative Commons NonCommercial (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode).
If the dataset itself ever becomes open source, this dataset will be open source too, since I produced the translation with GCP, which does not claim intellectual property over translations.
### Citation
```
@misc{no_robots,
author = {Nazneen Rajani and Lewis Tunstall and Edward Beeching and Nathan Lambert and Alexander M. Rush and Thomas Wolf},
title = {No Robots},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/datasets/HuggingFaceH4/no_robots}}
}
``` | merve/tr-h4-norobots | [
"task_categories:conversational",
"task_categories:text-generation",
"language:tr",
"license:cc-by-nc-4.0",
"arxiv:2203.02155",
"region:us"
] | 2024-01-07T13:50:32+00:00 | {"language": ["tr"], "license": "cc-by-nc-4.0", "task_categories": ["conversational", "text-generation"], "pretty_name": "No Robots", "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "train_sft.csv"}, {"split": "test_sft", "path": "test_sft.csv"}]}], "dataset_info": {"features": [{"name": "idx", "dtype": "int"}, {"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train_sft"}, {"name": "test_sft"}]}} | 2024-01-07T15:56:24+00:00 | [
"2203.02155"
] | [
"tr"
] | TAGS
#task_categories-conversational #task_categories-text-generation #language-Turkish #license-cc-by-nc-4.0 #arxiv-2203.02155 #region-us
| No Robots Dataset Card ️
===========================
### Summary
No Robots is a dataset of 10,000 instructions and demonstrations created by professional annotators. The translation was produced with the Google Cloud Platform Translation API. This dataset can be used to teach LLMs instruction following. (Instruction Supervised Fine-tuning - SFT)
The No Robots dataset was modeled after OpenAI's InstructGPT paper and has the following categories:
### Languages
This dataset contains Turkish only.
Dataset Structure
-----------------
I uploaded this dataset as CSV. If you want to see what the examples look like, check the widget.
### Data Fields
The columns are as follows:
* 'prompt': Specifies the instruction the model should follow.
* 'prompt\_id': Unique identifier.
* 'messages': A list of dictionaries; each dictionary describes a message (key: content) and who sent that message (key: role).
* 'category': The category of the task; I did not translate this field.
### Splits
### License
Unfortunately, this dataset is not open source but open access. Its license is Creative Commons NonCommercial (CC BY-NC 4.0).
If the dataset itself ever becomes open source, this dataset will be open source too, since I produced the translation with GCP, which does not claim intellectual property over translations.
| [
"### Özet\n\n\nNo Robots 10000 komut ve gösterimden oluşan, profesyonel etiketleyiciler tarafından oluşturulmuş bir verisetidir. Çevirisi Google Cloud Platform Translation API ile yapıldı. Bu veriset LLM'lere komut takibi öğretmek için kullanılabilir. (Instruction Supervised Fine-tuning - SFT)\nNo Robots veriseti OpenAI'ın InstructGPT makalesinden esinlenerek oluşturulmuştur ve aşağıdaki kategorilere sahiptir:",
"### Diller\n\n\nBu verisetinde sadece Türkçe var.\n\n\nVeriseti Yapısı\n---------------\n\n\nBu verisetini CSV olarak yükledim. Örneklerin neye benzediğini görmek istiyorsanız widget'a bakın.",
"### Veri Alanları\n\n\nKolonlar aşağıdaki gibidir:\n\n\n* 'prompt': Modelin takip etmesi gereken komutu belirler.\n* 'prompt\\_id': Unique identifier.\n* 'messages': Dictionary'ler içeren liste, her dictionary bir mesajı (key: content) ve o mesajı kimin gönderdiğini (key: role) açıklar.\n* 'category': Görevin kategorisi, bunu çevirmedim.",
"### Split'ler",
"### Lisans\n\n\nBu veriseti ne yazık ki açık kaynak değil açık erişimli. Lisansı Creative Commons NonCommercial (CC BY-NC 4.0).\nEğer verisetinin kendisi açık kaynak olursa bu veriseti de açık kaynak olacaktır, çünkü çevirisini çeviriler üstünde fikri mülkiyet istemeyen GCP tarafından yaptım."
] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #language-Turkish #license-cc-by-nc-4.0 #arxiv-2203.02155 #region-us \n",
"### Özet\n\n\nNo Robots 10000 komut ve gösterimden oluşan, profesyonel etiketleyiciler tarafından oluşturulmuş bir verisetidir. Çevirisi Google Cloud Platform Translation API ile yapıldı. Bu veriset LLM'lere komut takibi öğretmek için kullanılabilir. (Instruction Supervised Fine-tuning - SFT)\nNo Robots veriseti OpenAI'ın InstructGPT makalesinden esinlenerek oluşturulmuştur ve aşağıdaki kategorilere sahiptir:",
"### Diller\n\n\nBu verisetinde sadece Türkçe var.\n\n\nVeriseti Yapısı\n---------------\n\n\nBu verisetini CSV olarak yükledim. Örneklerin neye benzediğini görmek istiyorsanız widget'a bakın.",
"### Veri Alanları\n\n\nKolonlar aşağıdaki gibidir:\n\n\n* 'prompt': Modelin takip etmesi gereken komutu belirler.\n* 'prompt\\_id': Unique identifier.\n* 'messages': Dictionary'ler içeren liste, her dictionary bir mesajı (key: content) ve o mesajı kimin gönderdiğini (key: role) açıklar.\n* 'category': Görevin kategorisi, bunu çevirmedim.",
"### Split'ler",
"### Lisans\n\n\nBu veriseti ne yazık ki açık kaynak değil açık erişimli. Lisansı Creative Commons NonCommercial (CC BY-NC 4.0).\nEğer verisetinin kendisi açık kaynak olursa bu veriseti de açık kaynak olacaktır, çünkü çevirisini çeviriler üstünde fikri mülkiyet istemeyen GCP tarafından yaptım."
] | [
52,
99,
46,
102,
5,
68
] | [
"passage: TAGS\n#task_categories-conversational #task_categories-text-generation #language-Turkish #license-cc-by-nc-4.0 #arxiv-2203.02155 #region-us \n### Özet\n\n\nNo Robots 10000 komut ve gösterimden oluşan, profesyonel etiketleyiciler tarafından oluşturulmuş bir verisetidir. Çevirisi Google Cloud Platform Translation API ile yapıldı. Bu veriset LLM'lere komut takibi öğretmek için kullanılabilir. (Instruction Supervised Fine-tuning - SFT)\nNo Robots veriseti OpenAI'ın InstructGPT makalesinden esinlenerek oluşturulmuştur ve aşağıdaki kategorilere sahiptir:### Diller\n\n\nBu verisetinde sadece Türkçe var.\n\n\nVeriseti Yapısı\n---------------\n\n\nBu verisetini CSV olarak yükledim. Örneklerin neye benzediğini görmek istiyorsanız widget'a bakın.### Veri Alanları\n\n\nKolonlar aşağıdaki gibidir:\n\n\n* 'prompt': Modelin takip etmesi gereken komutu belirler.\n* 'prompt\\_id': Unique identifier.\n* 'messages': Dictionary'ler içeren liste, her dictionary bir mesajı (key: content) ve o mesajı kimin gönderdiğini (key: role) açıklar.\n* 'category': Görevin kategorisi, bunu çevirmedim.### Split'ler### Lisans\n\n\nBu veriseti ne yazık ki açık kaynak değil açık erişimli. Lisansı Creative Commons NonCommercial (CC BY-NC 4.0).\nEğer verisetinin kendisi açık kaynak olursa bu veriseti de açık kaynak olacaktır, çünkü çevirisini çeviriler üstünde fikri mülkiyet istemeyen GCP tarafından yaptım."
] |
657035312e41ea944a91b1e900fb377cca8a84e5 | Training model for gurmatgpt | adsazad/gurmat-dataset | [
"region:us"
] | 2024-01-07T14:51:10+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "train.csv"}]}]} | 2024-01-07T18:21:47+00:00 | [] | [] | TAGS
#region-us
| Training model for gurmatgpt | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
5b6eea02a7b2eeee5a06c1c3f85b4996839a7a03 | Uncleaned Chinese H (adult) novels.
For scientific research use only! | Limour/h-corpus-raw | [
"language:zh",
"license:apache-2.0",
"not-for-all-audiences",
"region:us"
] | 2024-01-07T15:14:28+00:00 | {"language": ["zh"], "license": "apache-2.0", "tags": ["not-for-all-audiences"]} | 2024-01-20T07:42:21+00:00 | [] | [
"zh"
] | TAGS
#language-Chinese #license-apache-2.0 #not-for-all-audiences #region-us
| Uncleaned Chinese H (adult) novels.
For scientific research use only! | [] | [
"TAGS\n#language-Chinese #license-apache-2.0 #not-for-all-audiences #region-us \n"
] | [
28
] | [
"passage: TAGS\n#language-Chinese #license-apache-2.0 #not-for-all-audiences #region-us \n"
] |
fee9f1d3ad86fabcf9db82d780858911b7a3babf | # Dataset Card for "NWGI_test"
## Dataset Class:
``` python
class NWGI(InstructionDataset):
    dataset = "NWGI"
    task_type = "classification"
    choices = [
        'strong negative', 'moderately negative', 'mildly negative', 'neutral',
        'mildly positive', 'moderately positive', 'strong positive',
    ]
    prompt = '''What is the sentiment of this news?
{input}
Please choose an answer from {{strong negative/moderately negative/mildly negative/neutral/mildly positive/moderately positive/strong positive}}.'''

    def fetch_data(self, datum):
        return {'input': datum['input'], 'answer': datum['answer']}
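
# Hypothetical usage sketch: fetch_data simply extracts the two fields the
# prompt template consumes from a raw record, e.g.
#   task = NWGI()
#   task.fetch_data({'input': 'Shares rallied after strong earnings.',
#                    'answer': 'mildly positive'})
#   # -> {'input': 'Shares rallied after strong earnings.', 'answer': 'mildly positive'}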
``` | PIXIU-fin/NWGI_test | [
"language:en",
"license:mit",
"region:us"
] | 2024-01-07T16:11:46+00:00 | {"language": ["en"], "license": "mit", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "valid", "path": "data/valid-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "query", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "gold", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 6885237, "num_examples": 12947}, {"name": "valid", "num_bytes": 1713486, "num_examples": 3237}, {"name": "test", "num_bytes": 2154392, "num_examples": 4047}], "download_size": 2852286, "dataset_size": 10753115}} | 2024-01-07T16:15:04+00:00 | [] | [
"en"
] | TAGS
#language-English #license-mit #region-us
| # Dataset Card for "NWGI_test"
## Dataset Class:
| [
"# Dataset Card for \"NWGI_test\"",
"## Dataset Class:"
] | [
"TAGS\n#language-English #license-mit #region-us \n",
"# Dataset Card for \"NWGI_test\"",
"## Dataset Class:"
] | [
15,
11,
5
] | [
"passage: TAGS\n#language-English #license-mit #region-us \n# Dataset Card for \"NWGI_test\"## Dataset Class:"
] |
e57176233da3a746b715e74caeb76d2392bf6ab4 | Types of text garbage in the dataset:
1. Face on the keyboard (random key mashing). (ойшойвщф фващощфащшгй0ш шйждыфл) - this garbage looks like randomly typed words. Collecting it is fairly simple: randomly generate "words" of varying length and, with some probability, insert punctuation marks between words and at the end of a sentence (a minimal generation sketch is given after this list).
2. A set of unrelated words. (замок двойка иван кванты чат). Most often this is a set of keywords from some website, or interface details. Generating this kind of garbage is also easy: take sentences from corpora (in my case librusec and web_public from here), tokenize them, shuffle the tokens, and that's it.
3. Texts containing grammatical errors, errors in word meaning, or any syntactic deviations that make a sentence lose its coherent meaning. (ученик учится в школа). This type of text is generated by randomly changing the declension of a given word.
4. Neural-network gibberish. This class of garbage is similar to the previous one, but does not always come down to incorrect declensions. (колонок настроен для лиц через 18 лет, в бильярдном кадре перекатывать)
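A minimal sketch of how garbage of types 1 and 2 could be generated (the helper names are hypothetical; the dataset's actual generation scripts are not published here):
```python
import random

RU_LETTERS = "абвгдежзийклмнопрстуфхцчшщъыьэюя"

def keyboard_mash(n_words: int) -> str:
    """Type 1: random 'keyboard mash' words with occasional punctuation."""
    words = []
    for _ in range(n_words):
        word = "".join(random.choices(RU_LETTERS, k=random.randint(2, 12)))
        if random.random() < 0.15:  # sometimes glue punctuation onto a word
            word += random.choice(",.!?")
        words.append(word)
    return " ".join(words)

def shuffled_sentence(sentence: str) -> str:
    """Type 2: destroy coherence by shuffling the tokens of a real sentence."""
    tokens = sentence.split()  # a real pipeline would use a proper tokenizer
    random.shuffle(tokens)
    return " ".join(tokens)

print(keyboard_mash(5))
print(shuffled_sentence("ученик учится в школе каждый день"))
```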
Blogpost: [link](https://t.me/den4ikresearch/9) | Den4ikAI/gibberish_dataset | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:ru",
"license:apache-2.0",
"region:us"
] | 2024-01-07T17:55:37+00:00 | {"language": ["ru"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"]} | 2024-01-07T18:20:25+00:00 | [] | [
"ru"
] | TAGS
#task_categories-text-classification #size_categories-10K<n<100K #language-Russian #license-apache-2.0 #region-us
| Виды текстового мусора в датасете:
1. Лицо на клавиатуре. (ойшойвщф фващощфащшгй0ш шйждыфл) - мусор выглядит как случайно набранные слова. Собрать такой мусор довольно просто. Нужно рандомно генерировать "слова" различной длины и с некоторой вероятностью вставлять знаки препинания между словами и в конце предложения.
2. Набор несвязных слов. (замок двойка иван кванты чат). Чаще всего является набором ключевых слов на каком-то сайте, деталями интерфейса. Генерация подобного мусора тоже не сложна. Берем предложения из корпусов (в моем случае librusec и web_public отсюда) токенизируем, перемешиваем токены и все.
3. Тексты с содержанием грамматических ошибок, ошибок в смысле слов или любые синтаксические отклонения, из-за которых предложение теряет связный смысл. (ученик учится в школа). Данный тип текстов генерируется с помощью случайного склонения данного слова.
4. Нейросетевой бред. Этот класс мусора похож на предыдущий, но не всегда заключается в неверных склонениях. (колонок настроен для лиц через 18 лет, в бильярдном кадре перекатывать)
Blogpost: link | [] | [
"TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-Russian #license-apache-2.0 #region-us \n"
] | [
42
] | [
"passage: TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-Russian #license-apache-2.0 #region-us \n"
] |
2746a8fa3d92619df6d0d0a96cfa10e647d188cb | Multilingual version of https://huggingface.co/datasets/nuprl/EditPackFT | nuprl/EditPackFT-Multi | [
"region:us"
] | 2024-01-07T18:28:07+00:00 | {"dataset_info": {"features": [{"name": "commit", "dtype": "string"}, {"name": "old_file", "dtype": "string"}, {"name": "new_file", "dtype": "string"}, {"name": "old_contents", "dtype": "string"}, {"name": "new_contents", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "lang", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repos", "dtype": "string"}, {"name": "config", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 653871806.0, "num_examples": 311591}], "download_size": 344314938, "dataset_size": 653871806.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-07T18:55:38+00:00 | [] | [] | TAGS
#region-us
| Multilingual version of URL | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
99daf8763ab634eb5e912c2e919f79fc9cbf888f |
PoQuaD dataset | arduwa/poquad-imp | [
"task_categories:question-answering",
"task_ids:extractive-qa",
"task_ids:open-domain-qa",
"annotations_creators:expert-generated",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"source_datasets:original",
"language:pl",
"license:cc-by-4.0",
"region:us"
] | 2024-01-07T19:16:14+00:00 | {"annotations_creators": ["expert-generated"], "language_creators": ["found"], "language": ["pl"], "license": ["cc-by-4.0"], "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "source_datasets": ["original"], "task_categories": ["question-answering"], "task_ids": ["extractive-qa", "open-domain-qa"], "pretty_name": "PoQuaD"} | 2024-01-07T19:49:13+00:00 | [] | [
"pl"
] | TAGS
#task_categories-question-answering #task_ids-extractive-qa #task_ids-open-domain-qa #annotations_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Polish #license-cc-by-4.0 #region-us
|
PoQuaD dataset | [] | [
"TAGS\n#task_categories-question-answering #task_ids-extractive-qa #task_ids-open-domain-qa #annotations_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Polish #license-cc-by-4.0 #region-us \n"
] | [
104
] | [
"passage: TAGS\n#task_categories-question-answering #task_ids-extractive-qa #task_ids-open-domain-qa #annotations_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Polish #license-cc-by-4.0 #region-us \n"
] |
1d2f15f366ce195da733bf00afd5c3dd9e548a4d |
<p align="right">
<a href="https://github.com/argilla-io/distilabel">
<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/>
</a>
</p>
# distilabel Orca Pairs for DPO
The dataset is a "distilabeled" version of the widely used dataset: [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs). The original dataset has been used by 100s of open-source practitioners and models. We knew from fixing UltraFeedback (and before that, Alpacas and Dollys) that this dataset could be highly improved.
Continuing with our mission to build the best alignment datasets for open-source LLMs and the community, we spent a few hours improving it with [distilabel](https://github.com/argilla-io/distilabel).
This was our main intuition: the original dataset just assumes gpt4/3.5-turbo are always the best response. We know from UltraFeedback that's not always the case. Moreover, DPO fine-tuning benefits from the diversity of preference pairs.
Additionally, we have added a new column indicating whether the question in the dataset is part of the train set of gsm8k (there were no examples from the test set). See the reproduction section for more details.
## Using this dataset
This dataset is useful for preference tuning and we recommend using it instead of the original. It's already prepared in the "standard" chosen, rejected format with additional information for further filtering and experimentation.
The main changes are:
1. ~2K pairs have been swapped: the rejected response becomes the chosen one. We have kept the original chosen and rejected in two new columns `original_*` for reproducibility purposes.
2. 4K pairs have been identified as `tie`: equally bad or good.
3. Chosen scores have been added: you can now filter out based on a threshold (see our distilabeled Hermes 2.5 model for an example)
4. We have kept the ratings and rationales generated with gpt-4-turbo and distilabel so you can prepare the data differently if you want.
5. We have added a column to indicate if the input is part of gsm8k train set.
In our experiments, we have got very good results by reducing the size of the dataset by more than 50%. Here's an example of how to achieve that:
```python
from datasets import load_dataset
# Instead of this:
# dataset = load_dataset("Intel/orca_dpo_pairs", split="train")
# use this:
dataset = load_dataset("argilla/distilabel-intel-orca-dpo-pairs", split="train")
dataset = dataset.filter(
    lambda r:
        r["status"] != "tie" and
        r["chosen_score"] >= 8 and
        not r["in_gsm8k_train"]
)
```
This results in `5,922` instead of `12,859` samples (54% reduction) and leads to better performance than the same model tuned with 100% of the samples in the original dataset.
> We'd love to hear about your experiments! If you want to try this out, consider joining our [Slack community](https://join.slack.com/t/rubrixworkspace/shared_invite/zt-whigkyjn-a3IUJLD7gDbTZ0rKlvcJ5g) and let's build some open datasets and models together.
## Reproducing the dataset
In this section, we outline the steps to reproduce this dataset.
### Rate original dataset pairs
Build a preference dataset with distilabel using the original dataset:
```python
from distilabel.llm import OpenAILLM
from distilabel.tasks import JudgeLMTask
from distilabel.pipeline import Pipeline
from datasets import load_dataset
import random
# Shuffle 'chosen' and 'rejected' to avoid positional bias and keep track of the order
def shuffle_and_track(chosen, rejected):
    pair = [chosen, rejected]
    random.shuffle(pair)
    order = ["chosen" if x == chosen else "rejected" for x in pair]
    return {"generations": pair, "order": order}
dataset = load_dataset("Intel/orca_dpo_pairs", split="train")
# This shuffles the pairs to mitigate positional bias
dataset = dataset.map(lambda x: shuffle_and_track(x["chosen"], x["rejected"]))
# We use our JudgeLM implementation to rate the original pairs
labeler = OpenAILLM(
    task=JudgeLMTask(),
    model="gpt-4-1106-preview",
    num_threads=16,
    max_new_tokens=512,
)
dataset = dataset.rename_columns({"question": "input"})
distipipe = Pipeline(
    labeller=labeler
)
# This computes ratings and natural language critiques for each pair
ds = distipipe.generate(dataset=dataset, num_generations=2)
```
If you want to further filter and curate the dataset, you can push the dataset to [Argilla](https://github.com/argilla-io/argilla) as follows:
```python
rg_dataset = ds.to_argilla()
rg_dataset.push_to_argilla(name="your_dataset_name", workspace="your_workspace_name")
```
You get a nice UI with a lot of pre-computed metadata to explore and curate the dataset:

The resulting dataset is now much more useful: we know which response is preferred (by gpt-4-turbo), which ones have low scores, and we even have natural language explanations. But what did we find? Was our intuition confirmed?

The above chart shows the following (a short snippet to reproduce these counts from the published dataset appears after the list):
* ~4,000 pairs were given the same rating (a tie).
* ~7,000 pairs were correct according to our AI judge (`unchanged`).
* and ~2,000 times the rejected response was preferred (`swapped`).
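A quick way to reproduce these counts from the published dataset (a minimal sketch; the `status` column is part of this repo's schema):
```python
from collections import Counter
from datasets import load_dataset

ds = load_dataset("argilla/distilabel-intel-orca-dpo-pairs", split="train")
print(Counter(ds["status"]))  # expect roughly: unchanged ~7k, tie ~4k, swapped ~2k
```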
Now the next question is: can we build better models with this new knowledge? The answer is the "distilabeled Hermes" model, check it out!
### Post-processing to add useful information
Swap rejected and chosen, and add chosen scores and status:
```python
import numpy as np

def add_status(r):
    status = "unchanged"
    if r['rating'] is None or r['rating'][0] == r['rating'][1]:
        status = "tie"
    else:
        # Compare the highest-rated index to the index of the chosen response
        highest_rated_idx = np.argmax(r['rating'])
        if r['order'][highest_rated_idx] != 'chosen':
            status = "swapped"
    return {"status": status}

def swap(r):
    chosen = r["chosen"]
    rejected = r["rejected"]
    if r['rating'] is not None:
        chosen_score = r['rating'][np.argmax(r['rating'])]
    else:
        chosen_score = None
    if r['status'] == "swapped":
        chosen = r["rejected"]
        rejected = r["chosen"]
    return {
        "chosen": chosen,
        "rejected": rejected,
        "original_chosen": r["chosen"],
        "original_rejected": r["rejected"],
        "chosen_score": chosen_score
    }

updated = ds.map(add_status).map(swap)
```
### gsm8k "decontamination"
We used a basic TF-IDF cosine-similarity approach to find duplicated examples. We didn't find any from the test sets. We experimented with lower thresholds, but below 0.8 they introduced false positives:
```python
import pandas as pd
import nltk
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from datasets import load_dataset
nltk.download('punkt')
# Load the datasets
source_dataset = load_dataset("gsm8k", "main", split="train")
source_dataset_socratic = load_dataset("gsm8k", "socratic", split="train")
#target_dataset = load_dataset("Intel/orca_dpo_pairs", split="train")
target_dataset = load_dataset("argilla/distilabel-intel-orca-dpo-pairs", split="train")
# Extract the 'question' column from each dataset
source_questions = source_dataset['question']
source_questions_socratic = source_dataset_socratic['question']
target_questions = target_dataset['input']
# Function to preprocess the text
def preprocess(text):
    return nltk.word_tokenize(text.lower())
# Preprocess the questions
source_questions_processed = [preprocess(q) for q in source_questions]
source_questions_processed.extend([preprocess(q) for q in source_questions_socratic])
source_questions = list(source_questions) + list(source_questions_socratic)  # keep the raw list aligned with the processed one
target_questions_processed = [preprocess(q) for q in target_questions]
# Vectorize the questions
vectorizer = TfidfVectorizer()
source_vec = vectorizer.fit_transform([' '.join(q) for q in source_questions_processed])
target_vec = vectorizer.transform([' '.join(q) for q in target_questions_processed])
# Calculate cosine similarity
similarity_matrix = cosine_similarity(source_vec, target_vec)
# Determine matches based on a threshold:
# checked manually and below 0.8 there are only false positives
threshold = 0.8
matching_pairs = []
for i, row in enumerate(similarity_matrix):
    for j, similarity in enumerate(row):
        if similarity >= threshold:
            matching_pairs.append((source_questions[i], target_questions[j], similarity))
# Create a DataFrame from the matching pairs
df = pd.DataFrame(matching_pairs, columns=['Source Question', 'Target Question', 'Similarity Score'])
# Create a set of matching target questions
matching_target_questions = list(df['Target Question'])
# Add a column to the target dataset indicating whether each question is matched
target_dataset = target_dataset.map(lambda example: {"in_gsm8k_train": example['input'] in matching_target_questions})
```
Result:
```
False 12780
True 79
Name: in_gsm8k_train
``` | argilla/distilabel-intel-orca-dpo-pairs | [
"language:en",
"license:apache-2.0",
"rlaif",
"dpo",
"rlhf",
"distilabel",
"synthetic",
"region:us"
] | 2024-01-07T19:41:53+00:00 | {"language": ["en"], "license": "apache-2.0", "dataset_info": {"features": [{"name": "system", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "generations", "sequence": "string"}, {"name": "order", "sequence": "string"}, {"name": "labelling_model", "dtype": "string"}, {"name": "labelling_prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "raw_labelling_response", "dtype": "string"}, {"name": "rating", "sequence": "float64"}, {"name": "rationale", "dtype": "string"}, {"name": "status", "dtype": "string"}, {"name": "original_chosen", "dtype": "string"}, {"name": "original_rejected", "dtype": "string"}, {"name": "chosen_score", "dtype": "float64"}, {"name": "in_gsm8k_train", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 161845559, "num_examples": 12859}], "download_size": 79210071, "dataset_size": 161845559}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["rlaif", "dpo", "rlhf", "distilabel", "synthetic"]} | 2024-02-05T15:35:14+00:00 | [] | [
"en"
] | TAGS
#language-English #license-apache-2.0 #rlaif #dpo #rlhf #distilabel #synthetic #region-us
|
<p align="right">
<a href="URL
<img src="URL alt="Built with Distilabel" width="200" height="32"/>
</a>
</p>
# distilabel Orca Pairs for DPO
The dataset is a "distilabeled" version of the widely used dataset: Intel/orca_dpo_pairs. The original dataset has been used by 100s of open-source practitioners and models. We knew from fixing UltraFeedback (and before that, Alpacas and Dollys) that this dataset could be highly improved.
Continuing with our mission to build the best alignment datasets for open-source LLMs and the community, we spent a few hours improving it with distilabel.
This was our main intuition: the original dataset just assumes gpt4/3.5-turbo are always the best response. We know from UltraFeedback that's not always the case. Moreover, DPO fine-tuning benefits from the diversity of preference pairs.
Additionally, we have added a new column indicating whether the question in the dataset is part of the train set of gsm8k (there were no examples from the test set). See the reproduction section for more details.
## Using this dataset
This dataset is useful for preference tuning and we recommend using it instead of the original. It's already prepared in the "standard" chosen, rejected format with additional information for further filtering and experimentation.
The main changes are:
1. ~2K pairs have been swapped: the rejected response becomes the chosen one. We have kept the original chosen and rejected in two new columns 'original_*' for reproducibility purposes.
2. 4K pairs have been identified as 'tie': equally bad or good.
3. Chosen scores have been added: you can now filter out based on a threshold (see our distilabeled Hermes 2.5 model for an example)
4. We have kept the ratings and rationales generated with gpt-4-turbo and distilabel so you can prepare the data differently if you want.
5. We have added a column to indicate if the input is part of gsm8k train set.
In our experiments, we have got very good results by reducing the size of the dataset by more than 50%. Here's an example of how to achieve that:
This results in '5,922' instead of '12,859' samples (54% reduction) and leads to better performance than the same model tuned with 100% of the samples in the original dataset.
> We'd love to hear about your experiments! If you want to try this out, consider joining our Slack community and let's build some open datasets and models together.
## Reproducing the dataset
In this section, we outline the steps to reproduce this dataset.
### Rate original dataset pairs
Build a preference dataset with distilabel using the original dataset:
If you want to further filter and curate the dataset, you can push the dataset to Argilla as follows:
You get a nice UI with a lot of pre-computed metadata to explore and curate the dataset:
!image/png
The resulting dataset is now much more useful: we know which response is preferred (by gpt-4-turbo), which ones have low scores, and we even have natural language explanations. But what did we find? Was our intuition confirmed?
!image/png
The above chart shows the following:
* ~4,000 pairs were given the same rating (a tie).
* ~7,000 pairs were correct according to our AI judge ('unchanged').
* and ~2,000 times the rejected response was preferred ('swapped').
Now the next question is: can we build better models with this new knowledge? The answer is the "distilabeled Hermes" model, check it out!
### Post-processing to add useful information
Swap rejected and chosen, and add chosen scores and status:
### gsm8k "decontamination"
We used a basic TF-IDF cosine-similarity approach to find duplicated examples. We didn't find any from the test sets. We experimented with lower thresholds, but below 0.8 they introduced false positives:
Result:
| [
"# distilabel Orca Pairs for DPO\n\nThe dataset is a \"distilabeled\" version of the widely used dataset: Intel/orca_dpo_pairs. The original dataset has been used by 100s of open-source practitioners and models. We knew from fixing UltraFeedback (and before that, Alpacas and Dollys) that this dataset could be highly improved.\n\nContinuing with our mission to build the best alignment datasets for open-source LLMs and the community, we spent a few hours improving it with distilabel. \n\nThis was our main intuition: the original dataset just assumes gpt4/3.5-turbo are always the best response. We know from UltraFeedback that's not always the case. Moreover, DPO fine-tuning benefits from the diversity of preference pairs. \n\nAdditionally, we have added a new column indicating whether the question in the dataset is part of the train set of gsm8k (there were no examples from the test set). See the reproduction section for more details.",
"## Using this dataset\nThis dataset is useful for preference tuning and we recommend using it instead of the original. It's already prepared in the \"standard\" chosen, rejected format with additional information for further filtering and experimentation. \n\nThe main changes are:\n\n1. ~2K pairs have been swapped: rejected become the chosen response. We have kept the original chosen and rejected on two new columns 'original_*' for reproducibility purposes.\n2. 4K pairs have been identified as 'tie': equally bad or good.\n3. Chosen scores have been added: you can now filter out based on a threshold (see our distilabeled Hermes 2.5 model for an example)\n4. We have kept the ratings and rationales generated with gpt-4-turbo and distilabel so you can prepare the data differently if you want.\n5. We have added a column to indicate if the input is part of gsm8k train set.\n\nIn our experiments, we have got very good results by reducing the size of the dataset by more than 50%. Here's an example of how to achieve that:\n\n\nThis results in '5,922' instead of '12,859' samples (54% reduction) and leads to better performance than the same model tuned with 100% of the samples in the original dataset.\n\n> We'd love to hear about your experiments! If you want to try this out, consider joining our Slack community and let's build some open datasets and models together.",
"## Reproducing the dataset\nIn this section, we outline the steps to reproduce this dataset.",
"### Rate original dataset pairs\n\nBuild a preference dataset with distilabel using the original dataset:\n\n\n\nIf you want to further filter and curate the dataset, you can push the dataset to Argilla as follows:\n\n\nYou get a nice UI with a lot of pre-computed metadata to explore and curate the dataset:\n\n!image/png\n\nThe resulting dataset is now much more useful: we know which response is preferred (by gpt-4-turbo), which ones have low scores, and we even have natural language explanations. But what did we find? Was our intuition confirmed?\n\n!image/png\n\nThe above chart shows the following: \n\n* ~4,000 pairs were given the same rating (a tie).\n* ~7,000 pairs were correct according to our AI judge ('unchanged').\n* and ~2,000 times the rejected response was preferred ('swapped').\n\n\n\nNow the next question is: can we build better models with this new knowledge? The answer is the \"distilabeled Hermes\" model, check it out!",
"### Post-processing to add useful information\n\n\nSwap rejected and chosen, and add chosen scores and status:",
"### gsm8k \"decontamination\"\nThe basic approach for finding duplicated examples. We didn't find any from the test sets. We experimented with lower thresholds but below 0.8 they introduced false positives:\n\nResult:"
] | [
"TAGS\n#language-English #license-apache-2.0 #rlaif #dpo #rlhf #distilabel #synthetic #region-us \n",
"# distilabel Orca Pairs for DPO\n\nThe dataset is a \"distilabeled\" version of the widely used dataset: Intel/orca_dpo_pairs. The original dataset has been used by 100s of open-source practitioners and models. We knew from fixing UltraFeedback (and before that, Alpacas and Dollys) that this dataset could be highly improved.\n\nContinuing with our mission to build the best alignment datasets for open-source LLMs and the community, we spent a few hours improving it with distilabel. \n\nThis was our main intuition: the original dataset just assumes gpt4/3.5-turbo are always the best response. We know from UltraFeedback that's not always the case. Moreover, DPO fine-tuning benefits from the diversity of preference pairs. \n\nAdditionally, we have added a new column indicating whether the question in the dataset is part of the train set of gsm8k (there were no examples from the test set). See the reproduction section for more details.",
"## Using this dataset\nThis dataset is useful for preference tuning and we recommend using it instead of the original. It's already prepared in the \"standard\" chosen, rejected format with additional information for further filtering and experimentation. \n\nThe main changes are:\n\n1. ~2K pairs have been swapped: rejected become the chosen response. We have kept the original chosen and rejected on two new columns 'original_*' for reproducibility purposes.\n2. 4K pairs have been identified as 'tie': equally bad or good.\n3. Chosen scores have been added: you can now filter out based on a threshold (see our distilabeled Hermes 2.5 model for an example)\n4. We have kept the ratings and rationales generated with gpt-4-turbo and distilabel so you can prepare the data differently if you want.\n5. We have added a column to indicate if the input is part of gsm8k train set.\n\nIn our experiments, we have got very good results by reducing the size of the dataset by more than 50%. Here's an example of how to achieve that:\n\n\nThis results in '5,922' instead of '12,859' samples (54% reduction) and leads to better performance than the same model tuned with 100% of the samples in the original dataset.\n\n> We'd love to hear about your experiments! If you want to try this out, consider joining our Slack community and let's build some open datasets and models together.",
"## Reproducing the dataset\nIn this section, we outline the steps to reproduce this dataset.",
"### Rate original dataset pairs\n\nBuild a preference dataset with distilabel using the original dataset:\n\n\n\nIf you want to further filter and curate the dataset, you can push the dataset to Argilla as follows:\n\n\nYou get a nice UI with a lot of pre-computed metadata to explore and curate the dataset:\n\n!image/png\n\nThe resulting dataset is now much more useful: we know which response is preferred (by gpt-4-turbo), which ones have low scores, and we even have natural language explanations. But what did we find? Was our intuition confirmed?\n\n!image/png\n\nThe above chart shows the following: \n\n* ~4,000 pairs were given the same rating (a tie).\n* ~7,000 pairs were correct according to our AI judge ('unchanged').\n* and ~2,000 times the rejected response was preferred ('swapped').\n\n\n\nNow the next question is: can we build better models with this new knowledge? The answer is the \"distilabeled Hermes\" model, check it out!",
"### Post-processing to add useful information\n\n\nSwap rejected and chosen, and add chosen scores and status:",
"### gsm8k \"decontamination\"\nThe basic approach for finding duplicated examples. We didn't find any from the test sets. We experimented with lower thresholds but below 0.8 they introduced false positives:\n\nResult:"
] | [
36,
249,
342,
22,
233,
28,
56
] | [
"passage: TAGS\n#language-English #license-apache-2.0 #rlaif #dpo #rlhf #distilabel #synthetic #region-us \n# distilabel Orca Pairs for DPO\n\nThe dataset is a \"distilabeled\" version of the widely used dataset: Intel/orca_dpo_pairs. The original dataset has been used by 100s of open-source practitioners and models. We knew from fixing UltraFeedback (and before that, Alpacas and Dollys) that this dataset could be highly improved.\n\nContinuing with our mission to build the best alignment datasets for open-source LLMs and the community, we spent a few hours improving it with distilabel. \n\nThis was our main intuition: the original dataset just assumes gpt4/3.5-turbo are always the best response. We know from UltraFeedback that's not always the case. Moreover, DPO fine-tuning benefits from the diversity of preference pairs. \n\nAdditionally, we have added a new column indicating whether the question in the dataset is part of the train set of gsm8k (there were no examples from the test set). See the reproduction section for more details."
] |
57781c923abfd17eda74eabe4dbb023e595ac3b0 |
An RL environment called 3DCarParking for the Godot Game Engine.
This environment was created with: https://github.com/edbeeching/godot_rl_agents
## Downloading the environment
After installing Godot RL Agents, download the environment with:
```
gdrl.env_from_hub -r edbeeching/godot_rl_3DCarParking
```
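After downloading, the environment can be consumed through Godot RL Agents' Python API; a rough sketch (the import path and constructor argument follow `godot_rl`'s `GodotEnv` wrapper as documented in the repo linked above, and the binary path is an assumption about where the download lands — verify both before use):
```python
from godot_rl.core.godot_env import GodotEnv  # assumed import path

# Point env_path at the downloaded executable for your platform (assumed location).
env = GodotEnv(env_path="examples/godot_rl_3DCarParking/bin/3DCarParking.x86_64")
obs = env.reset()
env.close()
```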
| edbeeching/godot_rl_3DCarParking | [
"deep-reinforcement-learning",
"reinforcement-learning",
"godot-rl",
"environments",
"video-games",
"region:us"
] | 2024-01-07T20:14:59+00:00 | {"library_name": "godot-rl", "tags": ["deep-reinforcement-learning", "reinforcement-learning", "godot-rl", "environments", "video-games"]} | 2024-01-07T20:15:09+00:00 | [] | [] | TAGS
#deep-reinforcement-learning #reinforcement-learning #godot-rl #environments #video-games #region-us
|
A RL environment called 3DCarParking for the Godot Game Engine.
This environment was created with: URL
## Downloading the environment
After installing Godot RL Agents, download the environment with:
| [
"## Downloading the environment \n\nAfter installing Godot RL Agents, download the environment with:"
] | [
"TAGS\n#deep-reinforcement-learning #reinforcement-learning #godot-rl #environments #video-games #region-us \n",
"## Downloading the environment \n\nAfter installing Godot RL Agents, download the environment with:"
] | [
32,
20
] | [
"passage: TAGS\n#deep-reinforcement-learning #reinforcement-learning #godot-rl #environments #video-games #region-us \n## Downloading the environment \n\nAfter installing Godot RL Agents, download the environment with:"
] |
f5c87ae5a2e7a5106606314eef45255f03151bb3 |
This dataset is derived from the [GermanQuAD](https://www.deepset.ai/germanquad) dataset.
This dataset takes the testset and represents it as a corpus in the [BEIR](https://github.com/beir-cellar/beir) information retrieval benchmark format.
Corpus and query ids have been added.
The corresponding qrels can be found [here](https://huggingface.co/datasets/mteb/germanquad-retrieval-qrels).
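A minimal loading sketch (the config and split names follow this repo's metadata; loading the qrels with the default config is an assumption):
```python
from datasets import load_dataset

corpus = load_dataset("mteb/germanquad-retrieval", "corpus", split="corpus")
queries = load_dataset("mteb/germanquad-retrieval", "queries", split="queries")
qrels = load_dataset("mteb/germanquad-retrieval-qrels")  # assumed default config
```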
Full credit for the original dataset goes to the [authors](https://arxiv.org/abs/2104.12741) of the GermanQuAD [dataset](https://huggingface.co/datasets/deepset/germandpr).
The original dataset is licensed under [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/).
Citation for the original dataset:
```
@misc{möller2021germanquad,
title={GermanQuAD and GermanDPR: Improving Non-English Question Answering and Passage Retrieval},
author={Timo Möller and Julian Risch and Malte Pietsch},
year={2021},
eprint={2104.12741},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
The derived dataset was created by [rasdani](https://huggingface.co/rasdani).
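A minimal loading sketch (our addition): the `corpus` and `queries` config names match this repository's configuration, while the qrels split name is an assumption to verify against the qrels repository.
```python
from datasets import load_dataset

corpus = load_dataset("mteb/germanquad-retrieval", "corpus", split="corpus")
queries = load_dataset("mteb/germanquad-retrieval", "queries", split="queries")
# The split name "test" is an assumption; check the qrels repo for the actual one.
qrels = load_dataset("mteb/germanquad-retrieval-qrels", split="test")
```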
| mteb/germanquad-retrieval | [
"source_datasets:deepset/germanquad",
"language:de",
"license:cc-by-4.0",
"arxiv:2104.12741",
"region:us"
] | 2024-01-07T20:17:07+00:00 | {"language": ["de"], "license": "cc-by-4.0", "source_datasets": ["deepset/germanquad"], "configs": [{"config_name": "corpus", "data_files": [{"split": "corpus", "path": "corpus/data-00000-of-00001.arrow"}]}, {"config_name": "queries", "data_files": [{"split": "queries", "path": "queries/data-00000-of-00001.arrow"}]}]} | 2024-01-08T17:53:16+00:00 | [
"2104.12741"
] | [
"de"
] | TAGS
#source_datasets-deepset/germanquad #language-German #license-cc-by-4.0 #arxiv-2104.12741 #region-us
|
This dataset is derived from the GermanQuAD dataset.
This dataset takes the test set and represents it as a corpus in the BEIR information retrieval benchmark format.
Corpus and query ids have been added.
The corresponding qrels can be found here.
Full credit for the original dataset goes to the authors of the GermanQuAD dataset.
The original dataset is licensed under CC BY-SA 4.0.
Citation for the original dataset:
The derived dataset was created by rasdani.
| [] | [
"TAGS\n#source_datasets-deepset/germanquad #language-German #license-cc-by-4.0 #arxiv-2104.12741 #region-us \n"
] | [
40
] | [
"passage: TAGS\n#source_datasets-deepset/germanquad #language-German #license-cc-by-4.0 #arxiv-2104.12741 #region-us \n"
] |
abbd9e68774a6829183398eb657bb42cd8ec1f23 | # Dataset Card for "multi-pooled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gowitheflowlab/multi-pooled | [
"region:us"
] | 2024-01-07T20:28:51+00:00 | {"dataset_info": {"features": [{"name": "sentence1", "dtype": "string"}, {"name": "sentence2", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "entailment", "1": "neutral", "2": "contradiction"}}}}], "splits": [{"name": "train", "num_bytes": 528983104.40889275, "num_examples": 1963485}], "download_size": 290954453, "dataset_size": 528983104.40889275}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-07T20:29:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "multi-pooled"
More Information needed | [
"# Dataset Card for \"multi-pooled\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"multi-pooled\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"multi-pooled\"\n\nMore Information needed"
] |
fbac39708c60a8d841e85acafdae1de32b8dfe17 |
# Dataset Card for intel-orca-dpo-pair-rg
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("dvilasuero/intel-orca-dpo-pair-rg")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("dvilasuero/intel-orca-dpo-pair-rg")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| input | input | text | True | True |
| generations-1 | generations-1 | text | True | True |
| generations-2 | generations-2 | text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| generations-1-rating | What's the rating for generations-1? | rating | True | N/A | [1, 2, 3, 4, 5, 6, 7, 8, 9, 10] |
| generations-2-rating | What's the rating for generations-2? | rating | True | N/A | [1, 2, 3, 4, 5, 6, 7, 8, 9, 10] |
| rating-rationale | What's the rationale behind each rating? | text | True | N/A | N/A |
The **suggestions** are human- or machine-generated recommendations for each question to assist the annotator during the annotation process; they are always linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata", containing the value(s) of the suggestion and its metadata, respectively. Accordingly, the possible values are the same as in the table above, but the column name is appended with "-suggestion" and the metadata with "-suggestion-metadata".
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
| length-input | length-input | integer | None - None | True |
| length-generations-1 | length-generations-1 | integer | None - None | True |
| rating-generations-1 | rating-generations-1 | float | None - None | True |
| length-generations-2 | length-generations-2 | integer | None - None | True |
| rating-generations-2 | rating-generations-2 | float | None - None | True |
| distance-best-rating | distance-best-rating | float | None - None | True |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"generations-1": "[\n [\"AFC Ajax (amateurs)\", \"has ground\", \"Sportpark De Toekomst\"],\n [\"Ajax Youth Academy\", \"plays at\", \"Sportpark De Toekomst\"]\n]",
"generations-2": "Sure, I\u0027d be happy to help! Here are the RDF triplets for the input sentence:\n\n[AFC Ajax (amateurs), hasGround, Sportpark De Toekomst]\n[Ajax Youth Academy, playsAt, Sportpark De Toekomst]\n\nExplanation:\n\n* AFC Ajax (amateurs) is the subject of the first triplet, and hasGround is the predicate that describes the relationship between AFC Ajax (amateurs) and Sportpark De Toekomst.\n* Ajax Youth Academy is the subject of the second triplet, and playsAt is the predicate that describes the relationship between Ajax Youth Academy and Sportpark De Toekomst.\n\nNote that there may be other possible RDF triplets that could be derived from the input sentence, but the above triplets capture the main relationships present in the sentence.",
"input": "You will be given a definition of a task first, then some input of the task.\nThis task is about using the specified sentence and converting the sentence to Resource Description Framework (RDF) triplets of the form (subject, predicate object). The RDF triplets generated must be such that the triplets accurately capture the structure and semantics of the input sentence. The input is a sentence and the output is a list of triplets of the form [subject, predicate, object] that capture the relationships present in the sentence. When a sentence has more than 1 RDF triplet possible, the output must contain all of them.\n\nAFC Ajax (amateurs)\u0027s ground is Sportpark De Toekomst where Ajax Youth Academy also play.\nOutput:"
},
"metadata": {
"distance-best-rating": 0.0,
"labelling-model": "gpt-4-1106-preview",
"length-generations-1": 131,
"length-generations-2": 731,
"length-input": 718,
"rating-generations-1": 9.0,
"rating-generations-2": 9.0
},
"responses": [],
"suggestions": [
{
"agent": null,
"question_name": "rating-rationale",
"score": null,
"type": null,
"value": "\nBoth Assistant 1 and Assistant 2 provided correct RDF triplets for the given sentence. Both assistants used a format that accurately represents the relationships present in the sentence with minor differences in the stylistic representation of the predicate.\n\nAssistant 1 used the natural language format for predicates, stating \"has ground\" and \"plays at\", which clearly aligns with the typical RDF representation where the predicate tries to be a URI that is more formal. However, since the task prompt doesn\u0027t specify a requirement for the predicates to be in URI form, this representation is acceptable, especially considering human readability.\n\nAssistant 2 transformed the predicates into a more formal-looking format by using camel case (hasGround, playsAt), which may suggest a transition towards a URI, although they are not provided as such. This is a common practice when designing RDF predicates, intending to align with web standards, although again, the task did not explicitly require this form.\n\nBoth assistants explained the relationships captured by the triplets, which is helpful for understanding how the RDF structure relates to the original sentence. There are no factual inaccuracies in either output, and both sets of triplets are equivalent in terms of the information they represent.\n\nOverall, the level of detail was similar, with each assistant providing a brief explanation following their respective triplets. Neither output contained unnecessary or irrelevant information, and no critical information was missing. \n\nBoth assistants would have received a score of 10 if the predicates were provided in a format that hinted at being URIs (e.g., prefixed with a namespace or in a full URI format), which is the more standard and formal practice for RDF predicates. Nevertheless, the assistants\u0027 performance was high given the context of the question, which did not specify this requirement. Therefore, both receive a score of 9."
},
{
"agent": null,
"question_name": "generations-1-rating",
"score": null,
"type": null,
"value": 9
},
{
"agent": null,
"question_name": "generations-2-rating",
"score": null,
"type": null,
"value": 9
}
],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"generations-1": "[\n [\"AFC Ajax (amateurs)\", \"has ground\", \"Sportpark De Toekomst\"],\n [\"Ajax Youth Academy\", \"plays at\", \"Sportpark De Toekomst\"]\n]",
"generations-1-rating": [],
"generations-1-rating-suggestion": 9,
"generations-1-rating-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"generations-2": "Sure, I\u0027d be happy to help! Here are the RDF triplets for the input sentence:\n\n[AFC Ajax (amateurs), hasGround, Sportpark De Toekomst]\n[Ajax Youth Academy, playsAt, Sportpark De Toekomst]\n\nExplanation:\n\n* AFC Ajax (amateurs) is the subject of the first triplet, and hasGround is the predicate that describes the relationship between AFC Ajax (amateurs) and Sportpark De Toekomst.\n* Ajax Youth Academy is the subject of the second triplet, and playsAt is the predicate that describes the relationship between Ajax Youth Academy and Sportpark De Toekomst.\n\nNote that there may be other possible RDF triplets that could be derived from the input sentence, but the above triplets capture the main relationships present in the sentence.",
"generations-2-rating": [],
"generations-2-rating-suggestion": 9,
"generations-2-rating-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"input": "You will be given a definition of a task first, then some input of the task.\nThis task is about using the specified sentence and converting the sentence to Resource Description Framework (RDF) triplets of the form (subject, predicate object). The RDF triplets generated must be such that the triplets accurately capture the structure and semantics of the input sentence. The input is a sentence and the output is a list of triplets of the form [subject, predicate, object] that capture the relationships present in the sentence. When a sentence has more than 1 RDF triplet possible, the output must contain all of them.\n\nAFC Ajax (amateurs)\u0027s ground is Sportpark De Toekomst where Ajax Youth Academy also play.\nOutput:",
"metadata": "{\"length-input\": 718, \"length-generations-1\": 131, \"length-generations-2\": 731, \"rating-generations-1\": 9.0, \"rating-generations-2\": 9.0, \"distance-best-rating\": 0.0, \"labelling-model\": \"gpt-4-1106-preview\"}",
"rating-rationale": [],
"rating-rationale-suggestion": "\nBoth Assistant 1 and Assistant 2 provided correct RDF triplets for the given sentence. Both assistants used a format that accurately represents the relationships present in the sentence with minor differences in the stylistic representation of the predicate.\n\nAssistant 1 used the natural language format for predicates, stating \"has ground\" and \"plays at\", which clearly aligns with the typical RDF representation where the predicate tries to be a URI that is more formal. However, since the task prompt doesn\u0027t specify a requirement for the predicates to be in URI form, this representation is acceptable, especially considering human readability.\n\nAssistant 2 transformed the predicates into a more formal-looking format by using camel case (hasGround, playsAt), which may suggest a transition towards a URI, although they are not provided as such. This is a common practice when designing RDF predicates, intending to align with web standards, although again, the task did not explicitly require this form.\n\nBoth assistants explained the relationships captured by the triplets, which is helpful for understanding how the RDF structure relates to the original sentence. There are no factual inaccuracies in either output, and both sets of triplets are equivalent in terms of the information they represent.\n\nOverall, the level of detail was similar, with each assistant providing a brief explanation following their respective triplets. Neither output contained unnecessary or irrelevant information, and no critical information was missing. \n\nBoth assistants would have received a score of 10 if the predicates were provided in a format that hinted at being URIs (e.g., prefixed with a namespace or in a full URI format), which is the more standard and formal practice for RDF predicates. Nevertheless, the assistants\u0027 performance was high given the context of the question, which did not specify this requirement. Therefore, both receive a score of 9.",
"rating-rationale-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **input** is of type `text`.
* **generations-1** is of type `text`.
* **generations-2** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **generations-1-rating** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
* **generations-2-rating** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
* **rating-rationale** is of type `text`.
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **generations-1-rating-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
* (optional) **generations-2-rating-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
* (optional) **rating-rationale-suggestion** is of type `text`.
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
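As a worked illustration (a sketch added here, not an official recipe), the rating suggestions described above can be turned into chosen/rejected preference pairs with plain `datasets`; the pairing logic is our assumption, and ties carry no preference signal:
```python
from datasets import load_dataset

ds = load_dataset("dvilasuero/intel-orca-dpo-pair-rg", split="train")

def to_preference_pair(row):
    # Column names come from the Data Fields section above.
    r1 = row["generations-1-rating-suggestion"]
    r2 = row["generations-2-rating-suggestion"]
    chosen, rejected = (
        (row["generations-1"], row["generations-2"])
        if r1 >= r2
        else (row["generations-2"], row["generations-1"])
    )
    return {"prompt": row["input"], "chosen": chosen, "rejected": rejected, "tie": r1 == r2}

pairs = ds.map(to_preference_pair).filter(lambda r: not r["tie"])
```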
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | dvilasuero/intel-orca-dpo-pair-rg | [
"size_categories:10K<n<100K",
"rlfh",
"argilla",
"human-feedback",
"region:us"
] | 2024-01-07T20:28:54+00:00 | {"size_categories": "10K<n<100K", "tags": ["rlfh", "argilla", "human-feedback"]} | 2024-01-07T20:28:57+00:00 | [] | [] | TAGS
#size_categories-10K<n<100K #rlfh #argilla #human-feedback #region-us
| Dataset Card for intel-orca-dpo-pair-rg
=======================================
This dataset has been created with Argilla.
As shown in the sections below, this dataset can be loaded into Argilla as explained in Load with Argilla, or used directly with the 'datasets' library in Load with 'datasets'.
Dataset Description
-------------------
* Homepage: URL
* Repository: URL
* Paper:
* Leaderboard:
* Point of Contact:
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named 'URL'. This configuration file will be used to configure the dataset when using the 'FeedbackDataset.from\_huggingface' method in Argilla.
* Dataset records in a format compatible with HuggingFace 'datasets'. These records will be loaded automatically when using 'FeedbackDataset.from\_huggingface' and can be loaded independently using the 'datasets' library via 'load\_dataset'.
* The annotation guidelines that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as 'pip install argilla --upgrade' and then use the following code:
### Load with 'datasets'
To load this dataset with 'datasets', you'll just need to install 'datasets' as 'pip install datasets --upgrade' and then use the following code:
### Supported Tasks and Leaderboards
This dataset can contain multiple fields, questions and responses so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the Dataset Structure section.
There are no leaderboards associated with this dataset.
### Languages
Dataset Structure
-----------------
### Data in Argilla
The dataset is created in Argilla with: fields, questions, suggestions, metadata, vectors, and guidelines.
The fields are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
The questions are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label\_selection, multi\_label\_selection, or ranking.
The suggestions are human- or machine-generated recommendations for each question to assist the annotator during the annotation process; they are always linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata", containing the value(s) of the suggestion and its metadata, respectively. Accordingly, the possible values are the same as in the table above, but the column name is appended with "-suggestion" and the metadata with "-suggestion-metadata".
The metadata is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the 'metadata\_properties' defined in the dataset configuration file in 'URL'.
The guidelines, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the annotation guidelines section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
While the same record in HuggingFace 'datasets' looks as follows:
### Data Fields
Among the dataset fields, we differentiate between the following:
* Fields: These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
+ input is of type 'text'.
+ generations-1 is of type 'text'.
+ generations-2 is of type 'text'.
* Questions: These are the questions that will be asked to the annotators. They can be of different types, such as 'RatingQuestion', 'TextQuestion', 'LabelQuestion', 'MultiLabelQuestion', and 'RankingQuestion'.
+ generations-1-rating is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
+ generations-2-rating is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
+ rating-rationale is of type 'text'.
* Suggestions: As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
+ (optional) generations-1-rating-suggestion is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
+ (optional) generations-2-rating-suggestion is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
+ (optional) rating-rationale-suggestion is of type 'text'.
Additionally, we also have two more fields that are optional and are the following:
* metadata: This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the 'metadata\_properties' defined in the dataset configuration file in 'URL'.
* external\_id: This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is 'train'.
Dataset Creation
----------------
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation guidelines
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
Considerations for Using the Data
---------------------------------
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
Additional Information
----------------------
### Dataset Curators
### Licensing Information
### Contributions
| [
"### Dataset Summary\n\n\nThis dataset contains:\n\n\n* A dataset configuration file conforming to the Argilla dataset format named 'URL'. This configuration file will be used to configure the dataset when using the 'FeedbackDataset.from\\_huggingface' method in Argilla.\n* Dataset records in a format compatible with HuggingFace 'datasets'. These records will be loaded automatically when using 'FeedbackDataset.from\\_huggingface' and can be loaded independently using the 'datasets' library via 'load\\_dataset'.\n* The annotation guidelines that have been used for building and curating the dataset, if they've been defined in Argilla.",
"### Load with Argilla\n\n\nTo load with Argilla, you'll just need to install Argilla as 'pip install argilla --upgrade' and then use the following code:",
"### Load with 'datasets'\n\n\nTo load this dataset with 'datasets', you'll just need to install 'datasets' as 'pip install datasets --upgrade' and then use the following code:",
"### Supported Tasks and Leaderboards\n\n\nThis dataset can contain multiple fields, questions and responses so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the Dataset Structure section.\n\n\nThere are no leaderboards associated with this dataset.",
"### Languages\n\n\nDataset Structure\n-----------------",
"### Data in Argilla\n\n\nThe dataset is created in Argilla with: fields, questions, suggestions, metadata, vectors, and guidelines.\n\n\nThe fields are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.\n\n\n\nThe questions are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label\\_selection, multi\\_label\\_selection, or ranking.\n\n\n\nThe suggestions are human or machine generated recommendations for each question to assist the annotator during the annotation process, so those are always linked to the existing questions, and named appending \"-suggestion\" and \"-suggestion-metadata\" to those, containing the value/s of the suggestion and its metadata, respectively. So on, the possible values are the same as in the table above, but the column name is appended with \"-suggestion\" and the metadata is appended with \"-suggestion-metadata\".\n\n\nThe metadata is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the 'metadata\\_properties' defined in the dataset configuration file in 'URL'.\n\n\n\nThe guidelines, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the annotation guidelines section.",
"### Data Instances\n\n\nAn example of a dataset instance in Argilla looks as follows:\n\n\nWhile the same record in HuggingFace 'datasets' looks as follows:",
"### Data Fields\n\n\nAmong the dataset fields, we differentiate between the following:\n\n\n* Fields: These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.\n\n\n\t+ input is of type 'text'.\n\t+ generations-1 is of type 'text'.\n\t+ generations-2 is of type 'text'.\n* Questions: These are the questions that will be asked to the annotators. They can be of different types, such as 'RatingQuestion', 'TextQuestion', 'LabelQuestion', 'MultiLabelQuestion', and 'RankingQuestion'.\n\n\n\t+ generations-1-rating is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].\n\t+ generations-2-rating is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].\n\t+ rating-rationale is of type 'text'.\n* Suggestions: As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.\n\n\n\t+ (optional) generations-1-rating-suggestion is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].\n\t+ (optional) generations-2-rating-suggestion is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].\n\t+ (optional) rating-rationale-suggestion is of type 'text'.\n\n\nAdditionally, we also have two more fields that are optional and are the following:\n\n\n* metadata: This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the 'metadata\\_properties' defined in the dataset configuration file in 'URL'.\n* external\\_id: This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.",
"### Data Splits\n\n\nThe dataset contains a single split, which is 'train'.\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation guidelines",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] | [
"TAGS\n#size_categories-10K<n<100K #rlfh #argilla #human-feedback #region-us \n",
"### Dataset Summary\n\n\nThis dataset contains:\n\n\n* A dataset configuration file conforming to the Argilla dataset format named 'URL'. This configuration file will be used to configure the dataset when using the 'FeedbackDataset.from\\_huggingface' method in Argilla.\n* Dataset records in a format compatible with HuggingFace 'datasets'. These records will be loaded automatically when using 'FeedbackDataset.from\\_huggingface' and can be loaded independently using the 'datasets' library via 'load\\_dataset'.\n* The annotation guidelines that have been used for building and curating the dataset, if they've been defined in Argilla.",
"### Load with Argilla\n\n\nTo load with Argilla, you'll just need to install Argilla as 'pip install argilla --upgrade' and then use the following code:",
"### Load with 'datasets'\n\n\nTo load this dataset with 'datasets', you'll just need to install 'datasets' as 'pip install datasets --upgrade' and then use the following code:",
"### Supported Tasks and Leaderboards\n\n\nThis dataset can contain multiple fields, questions and responses so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the Dataset Structure section.\n\n\nThere are no leaderboards associated with this dataset.",
"### Languages\n\n\nDataset Structure\n-----------------",
"### Data in Argilla\n\n\nThe dataset is created in Argilla with: fields, questions, suggestions, metadata, vectors, and guidelines.\n\n\nThe fields are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.\n\n\n\nThe questions are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label\\_selection, multi\\_label\\_selection, or ranking.\n\n\n\nThe suggestions are human or machine generated recommendations for each question to assist the annotator during the annotation process, so those are always linked to the existing questions, and named appending \"-suggestion\" and \"-suggestion-metadata\" to those, containing the value/s of the suggestion and its metadata, respectively. So on, the possible values are the same as in the table above, but the column name is appended with \"-suggestion\" and the metadata is appended with \"-suggestion-metadata\".\n\n\nThe metadata is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the 'metadata\\_properties' defined in the dataset configuration file in 'URL'.\n\n\n\nThe guidelines, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the annotation guidelines section.",
"### Data Instances\n\n\nAn example of a dataset instance in Argilla looks as follows:\n\n\nWhile the same record in HuggingFace 'datasets' looks as follows:",
"### Data Fields\n\n\nAmong the dataset fields, we differentiate between the following:\n\n\n* Fields: These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.\n\n\n\t+ input is of type 'text'.\n\t+ generations-1 is of type 'text'.\n\t+ generations-2 is of type 'text'.\n* Questions: These are the questions that will be asked to the annotators. They can be of different types, such as 'RatingQuestion', 'TextQuestion', 'LabelQuestion', 'MultiLabelQuestion', and 'RankingQuestion'.\n\n\n\t+ generations-1-rating is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].\n\t+ generations-2-rating is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].\n\t+ rating-rationale is of type 'text'.\n* Suggestions: As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.\n\n\n\t+ (optional) generations-1-rating-suggestion is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].\n\t+ (optional) generations-2-rating-suggestion is of type 'rating' with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].\n\t+ (optional) rating-rationale-suggestion is of type 'text'.\n\n\nAdditionally, we also have two more fields that are optional and are the following:\n\n\n* metadata: This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the 'metadata\\_properties' defined in the dataset configuration file in 'URL'.\n* external\\_id: This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.",
"### Data Splits\n\n\nThe dataset contains a single split, which is 'train'.\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation guidelines",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] | [
29,
162,
40,
53,
68,
11,
404,
40,
632,
27,
7,
4,
10,
10,
5,
5,
5,
9,
18,
7,
8,
14,
6,
6,
5
] | [
"passage: TAGS\n#size_categories-10K<n<100K #rlfh #argilla #human-feedback #region-us \n### Dataset Summary\n\n\nThis dataset contains:\n\n\n* A dataset configuration file conforming to the Argilla dataset format named 'URL'. This configuration file will be used to configure the dataset when using the 'FeedbackDataset.from\\_huggingface' method in Argilla.\n* Dataset records in a format compatible with HuggingFace 'datasets'. These records will be loaded automatically when using 'FeedbackDataset.from\\_huggingface' and can be loaded independently using the 'datasets' library via 'load\\_dataset'.\n* The annotation guidelines that have been used for building and curating the dataset, if they've been defined in Argilla.### Load with Argilla\n\n\nTo load with Argilla, you'll just need to install Argilla as 'pip install argilla --upgrade' and then use the following code:### Load with 'datasets'\n\n\nTo load this dataset with 'datasets', you'll just need to install 'datasets' as 'pip install datasets --upgrade' and then use the following code:### Supported Tasks and Leaderboards\n\n\nThis dataset can contain multiple fields, questions and responses so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the Dataset Structure section.\n\n\nThere are no leaderboards associated with this dataset.### Languages\n\n\nDataset Structure\n-----------------",
"passage: ### Data in Argilla\n\n\nThe dataset is created in Argilla with: fields, questions, suggestions, metadata, vectors, and guidelines.\n\n\nThe fields are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.\n\n\n\nThe questions are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label\\_selection, multi\\_label\\_selection, or ranking.\n\n\n\nThe suggestions are human or machine generated recommendations for each question to assist the annotator during the annotation process, so those are always linked to the existing questions, and named appending \"-suggestion\" and \"-suggestion-metadata\" to those, containing the value/s of the suggestion and its metadata, respectively. So on, the possible values are the same as in the table above, but the column name is appended with \"-suggestion\" and the metadata is appended with \"-suggestion-metadata\".\n\n\nThe metadata is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the 'metadata\\_properties' defined in the dataset configuration file in 'URL'.\n\n\n\nThe guidelines, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the annotation guidelines section.### Data Instances\n\n\nAn example of a dataset instance in Argilla looks as follows:\n\n\nWhile the same record in HuggingFace 'datasets' looks as follows:"
] |
535ea1a4c22cb63a4b4b1fe79bf9f4c6d04cd4ec | # OSCAR EU 6x3M Dataset
## Overview
The OSCAR EU 6x3M dataset is a carefully curated subset of the larger OSCAR corpus, specifically focusing on the main European languages. This dataset includes a balanced representation of six languages: English (en), German (de), Spanish (es), Italian (it), French (fr), and Russian (ru). The "6x3M" in the name signifies that each language is represented with approximately 3 million randomly sampled documents, providing a comprehensive and diverse linguistic resource.
## Dataset Description
- **Languages Included**: English, German, Spanish, Italian, French, Russian
- **Number of Documents**: Approximately 18 million (3 million per language)
- **Data Source**: The dataset is derived from the OSCAR corpus, a large multilingual corpus created from the Common Crawl.
## Use Cases
This dataset is ideal for a variety of natural language processing applications, including but not limited to:
- Multilingual language modeling
- Cross-linguistic transfer learning
- Language identification and classification
- Comparative linguistic studies
## Accessing the Dataset
The dataset is available through the HuggingFace Datasets library. You can load the dataset using the following code snippet:
```python
from datasets import load_dataset
dataset = load_dataset("tamedai/oscar_eu_6x3M")
```
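Given the corpus size, streaming is often more practical than a full download; the sketch below (our addition) filters a single language using the `language` column from the dataset features:
```python
from datasets import load_dataset

# Stream to avoid materializing the full corpus; `language` holds
# two-letter codes (en, de, es, it, fr, ru) per the card.
stream = load_dataset("tamedai/oscar_eu_6x3M", split="train", streaming=True)
german = stream.filter(lambda row: row["language"] == "de")
for row in german.take(3):
    print(row["id"], row["text"][:80])
```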
| tamedai/oscar_eu_6x3M | [
"region:us"
] | 2024-01-07T20:34:38+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "language", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 60314308571, "num_examples": 18000000}], "download_size": 34795421825, "dataset_size": 60314308571}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-08T13:48:34+00:00 | [] | [] | TAGS
#region-us
| # OSCAR EU 6x3M Dataset
## Overview
The OSCAR EU 6x3M dataset is a carefully curated subset of the larger OSCAR corpus, specifically focusing on the main European languages. This dataset includes a balanced representation of six languages: English (en), German (de), Spanish (es), Italian (it), French (fr), and Russian (ru). The "6x3M" in the name signifies that each language is represented with approximately 3 million randomly sampled documents, providing a comprehensive and diverse linguistic resource.
## Dataset Description
- Languages Included: English, German, Spanish, Italian, French, Russian
- Number of Documents: Approximately 18 million (3 million per language)
- Data Source: The dataset is derived from the OSCAR corpus, a large multilingual corpus created from the Common Crawl.
## Use Cases
This dataset is ideal for a variety of natural language processing applications, including but not limited to:
- Multilingual language modeling
- Cross-linguistic transfer learning
- Language identification and classification
- Comparative linguistic studies
## Accessing the Dataset
The dataset is available through the HuggingFace Datasets library. You can load the dataset using the following code snippet:
'''python
from datasets import load_dataset
dataset = load_dataset("tamedai/oscar_eu_6x3M")
'''
| [
"# OSCAR EU 6x3M Dataset",
"## Overview\nThe OSCAR EU 6x3M dataset is a carefully curated subset of the larger OSCAR corpus, specifically focusing on the main European languages. This dataset includes a balanced representation of six languages: English (en), German (de), Spanish (es), Italian (it), French (fr), and Russian (ru). The \"6x3M\" in the name signifies that each language is represented with approximately 3 million randomly sampled documents, providing a comprehensive and diverse linguistic resource.",
"## Dataset Description\n- Languages Included: English, German, Spanish, Italian, French, Russian\n- Number of Documents: Approximately 18 million (3 million per language)\n- Data Source: The dataset is derived from the OSCAR corpus, a large multilingual corpus created from the Common Crawl.",
"## Use Cases\nThis dataset is ideal for a variety of natural language processing applications, including but not limited to:\n- Multilingual language modeling\n- Cross-linguistic transfer learning\n- Language identification and classification\n- Comparative linguistic studies",
"## Accessing the Dataset\nThe dataset is available through the HuggingFace Datasets library. You can load the dataset using the following code snippet:\n'''python\nfrom datasets import load_dataset\n\ndataset = load_dataset(\"oscar_eu_6x3M\")"
] | [
"TAGS\n#region-us \n",
"# OSCAR EU 6x3M Dataset",
"## Overview\nThe OSCAR EU 6x3M dataset is a carefully curated subset of the larger OSCAR corpus, specifically focusing on the main European languages. This dataset includes a balanced representation of six languages: English (en), German (de), Spanish (es), Italian (it), French (fr), and Russian (ru). The \"6x3M\" in the name signifies that each language is represented with approximately 3 million randomly sampled documents, providing a comprehensive and diverse linguistic resource.",
"## Dataset Description\n- Languages Included: English, German, Spanish, Italian, French, Russian\n- Number of Documents: Approximately 18 million (3 million per language)\n- Data Source: The dataset is derived from the OSCAR corpus, a large multilingual corpus created from the Common Crawl.",
"## Use Cases\nThis dataset is ideal for a variety of natural language processing applications, including but not limited to:\n- Multilingual language modeling\n- Cross-linguistic transfer learning\n- Language identification and classification\n- Comparative linguistic studies",
"## Accessing the Dataset\nThe dataset is available through the HuggingFace Datasets library. You can load the dataset using the following code snippet:\n'''python\nfrom datasets import load_dataset\n\ndataset = load_dataset(\"oscar_eu_6x3M\")"
] | [
6,
10,
113,
68,
52,
69
] | [
"passage: TAGS\n#region-us \n# OSCAR EU 6x3M Dataset## Overview\nThe OSCAR EU 6x3M dataset is a carefully curated subset of the larger OSCAR corpus, specifically focusing on the main European languages. This dataset includes a balanced representation of six languages: English (en), German (de), Spanish (es), Italian (it), French (fr), and Russian (ru). The \"6x3M\" in the name signifies that each language is represented with approximately 3 million randomly sampled documents, providing a comprehensive and diverse linguistic resource.## Dataset Description\n- Languages Included: English, German, Spanish, Italian, French, Russian\n- Number of Documents: Approximately 18 million (3 million per language)\n- Data Source: The dataset is derived from the OSCAR corpus, a large multilingual corpus created from the Common Crawl.## Use Cases\nThis dataset is ideal for a variety of natural language processing applications, including but not limited to:\n- Multilingual language modeling\n- Cross-linguistic transfer learning\n- Language identification and classification\n- Comparative linguistic studies## Accessing the Dataset\nThe dataset is available through the HuggingFace Datasets library. You can load the dataset using the following code snippet:\n'''python\nfrom datasets import load_dataset\n\ndataset = load_dataset(\"oscar_eu_6x3M\")"
] |
cb43e00bfa15d4d5cf550516ccdb055367e60044 |
# Big Hard Negatives Dataset
A dataset for training embedding models for semantic search.
TODO: add desc
A dataset in a [nixietune](https://github.com/nixiesearch/nixietune) compatible format:
```json
{
"query": ")what was the immediate impact of the success of the manhattan project?",
"pos": [
"The presence of communication amid scientific minds was equally important to the success of the Manhattan Project as scientific intellect was. The only cloud hanging over the impressive achievement of the atomic researchers and engineers is what their success truly meant; hundreds of thousands of innocent lives obliterated."
],
"neg": [
"Abstract. The pivotal engineering and scientific success of the Twentieth century was the Manhattan Project. The Manhattan Project assimilated concepts and leaders from all scientific fields and engineering disciplines to construct the first two atomic bombs.",
"The pivotal engineering and scientific success of the Twentieth century was the Manhattan Project. The Manhattan Project assimilated concepts and leaders from all scientific fields and engineering disciplines to construct the first two atomic bombs."
]
}
```
## Usage
To use with HF datasets:
```bash
pip install datasets zstandard
```
```python
from datasets import load_dataset
data = load_dataset('nixiesearch/bfhnd')
print(data["train"].features)
```
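For trainers that expect flat (query, passage, label) rows rather than nested triples, a batched `map` can explode each record. This is a hedged sketch: the field names follow the `dataset_info` in this card's metadata (`positive`/`negative` sequences), while the JSON example above shows `pos`/`neg`, so verify which keys your snapshot exposes.
```python
def expand(batch):
    # Explode each (query, positives, negatives) record into labeled pairs.
    queries, passages, labels = [], [], []
    for q, pos, neg in zip(batch["query"], batch["positive"], batch["negative"]):
        for p in pos:
            queries.append(q); passages.append(p); labels.append(1)
        for n in neg:
            queries.append(q); passages.append(n); labels.append(0)
    return {"query": queries, "passage": passages, "label": labels}

pairs = data["train"].map(expand, batched=True, remove_columns=data["train"].column_names)
```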
## License
Apache 2.0 | nixiesearch/bfhnd | [
"task_categories:sentence-similarity",
"size_categories:100K<n<1M",
"source_datasets:BeIR",
"language:en",
"license:apache-2.0",
"text",
"region:us"
] | 2024-01-07T20:36:04+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "source_datasets": ["BeIR"], "task_categories": ["sentence-similarity"], "pretty_name": "BFHND: Big Hard Negatives Dataset", "tags": ["text"], "dataset_info": {"config_name": "default", "features": [{"name": "query", "dtype": "string"}, {"name": "positive", "sequence": "string"}, {"name": "negative", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 221539473625, "num_examples": 7240617}]}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train/*"}]}], "train-eval-index": [{"config": "default", "task": "sentence-similarity", "splits": {"train_split": "train"}}]} | 2024-01-09T09:06:39+00:00 | [] | [
"en"
] | TAGS
#task_categories-sentence-similarity #size_categories-100K<n<1M #source_datasets-BeIR #language-English #license-apache-2.0 #text #region-us
|
# Big Hard Negatives Dataset
A dataset for training embedding models for semantic search.
TODO: add desc
A dataset in a nixietune compatible format:
## Usage
To use with HF datasets:
## License
Apache 2.0 | [
"# Big Hard Negatives Dataset\n\nA dataset for training embedding models for semantic search.\n\nTODO: add desc\n\nA dataset in a nixietune compatible format:",
"## Usage\n\nTo use with HF datasets:",
"## License\n\nApache 2.0"
] | [
"TAGS\n#task_categories-sentence-similarity #size_categories-100K<n<1M #source_datasets-BeIR #language-English #license-apache-2.0 #text #region-us \n",
"# Big Hard Negatives Dataset\n\nA dataset for training embedding models for semantic search.\n\nTODO: add desc\n\nA dataset in a nixietune compatible format:",
"## Usage\n\nTo use with HF datasets:",
"## License\n\nApache 2.0"
] | [
54,
38,
12,
5
] | [
"passage: TAGS\n#task_categories-sentence-similarity #size_categories-100K<n<1M #source_datasets-BeIR #language-English #license-apache-2.0 #text #region-us \n# Big Hard Negatives Dataset\n\nA dataset for training embedding models for semantic search.\n\nTODO: add desc\n\nA dataset in a nixietune compatible format:## Usage\n\nTo use with HF datasets:## License\n\nApache 2.0"
] |
24a00717d0c00fa5ae864f6a93776f9ec0276c0e |
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2
Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248M-v2](https://huggingface.co/Locutusque/TinyMistral-248M-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2",
"harness_winogrande_5",
split="train")
```
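To enumerate the available configs or pull the aggregated metrics instead of a single task's details, here is a hedged sketch (the `results` config and `latest` split names follow the card's description and should be verified):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2"
print(get_dataset_config_names(repo))  # 63 task configs plus "results"

# Assumption: the aggregated "results" config exposes a "latest" split.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```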
## Latest results
These are the [latest results from run 2024-01-07T20:59:32.750418](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2/blob/main/results_2024-01-07T20-59-32.750418.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23444137768381748,
"acc_stderr": 0.030036991331001676,
"acc_norm": 0.23411204926810086,
"acc_norm_stderr": 0.030827397503827552,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766365,
"mc2": 0.49601663232017196,
"mc2_stderr": 0.01564731250181349
},
"harness|arc:challenge|25": {
"acc": 0.18600682593856654,
"acc_stderr": 0.011370940183266738,
"acc_norm": 0.21245733788395904,
"acc_norm_stderr": 0.01195348290658295
},
"harness|hellaswag|10": {
"acc": 0.26180043815972914,
"acc_stderr": 0.004387161203087972,
"acc_norm": 0.26558454491137223,
"acc_norm_stderr": 0.004407413723383408
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.024959918028911274,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.024959918028911274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198816,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198816
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.021132859182754454,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.021132859182754454
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.022331707611823085,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.022331707611823085
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.16748768472906403,
"acc_stderr": 0.026273086047535428,
"acc_norm": 0.16748768472906403,
"acc_norm_stderr": 0.026273086047535428
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18652849740932642,
"acc_stderr": 0.028112091210117447,
"acc_norm": 0.18652849740932642,
"acc_norm_stderr": 0.028112091210117447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.02093244577446318,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.02093244577446318
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000683,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000683
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2018348623853211,
"acc_stderr": 0.017208579357787565,
"acc_norm": 0.2018348623853211,
"acc_norm_stderr": 0.017208579357787565
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.025695341643824688,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.025695341643824688
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955917,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955917
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.031024411740572206,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.031024411740572206
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841043,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841043
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.015162024152278434,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.015162024152278434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.02298959254312357,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.02298959254312357
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341016,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341016
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20679012345679013,
"acc_stderr": 0.02253500670594282,
"acc_norm": 0.20679012345679013,
"acc_norm_stderr": 0.02253500670594282
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177788,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177788
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766365,
"mc2": 0.49601663232017196,
"mc2_stderr": 0.01564731250181349
},
"harness|winogrande|5": {
"acc": 0.5185477505919495,
"acc_stderr": 0.014042813708888378
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
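
If you work from the raw JSON above instead, a small sketch for extracting per-task accuracy (it assumes a local copy of the linked results file, with the same shape as shown):

```python
import json

# Hypothetical local copy of the results file linked above.
with open("results_2024-01-07T20-59-32.750418.json") as f:
    results = json.load(f)

# Print accuracy per harness task, skipping the "all" aggregate.
# Entries without an "acc" key (e.g. truthfulqa's mc1/mc2) are skipped.
for task, metrics in results.items():
    if task != "all" and "acc" in metrics:
        print(f"{task}: {metrics['acc']:.4f}")
```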
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2 | [
"region:us"
] | 2024-01-07T21:01:49+00:00 | {"pretty_name": "Evaluation run of Locutusque/TinyMistral-248M-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248M-v2](https://huggingface.co/Locutusque/TinyMistral-248M-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T20:59:32.750418](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2/blob/main/results_2024-01-07T20-59-32.750418.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23444137768381748,\n \"acc_stderr\": 0.030036991331001676,\n \"acc_norm\": 0.23411204926810086,\n \"acc_norm_stderr\": 0.030827397503827552,\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766365,\n \"mc2\": 0.49601663232017196,\n \"mc2_stderr\": 0.01564731250181349\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.18600682593856654,\n \"acc_stderr\": 0.011370940183266738,\n \"acc_norm\": 0.21245733788395904,\n \"acc_norm_stderr\": 0.01195348290658295\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26180043815972914,\n \"acc_stderr\": 0.004387161203087972,\n \"acc_norm\": 0.26558454491137223,\n \"acc_norm_stderr\": 0.004407413723383408\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.024959918028911274,\n \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.024959918028911274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.030631145539198816,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.030631145539198816\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.021132859182754454,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.021132859182754454\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.19032258064516128,\n \"acc_stderr\": 0.022331707611823085,\n \"acc_norm\": 0.19032258064516128,\n \"acc_norm_stderr\": 0.022331707611823085\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.16748768472906403,\n \"acc_stderr\": 0.026273086047535428,\n \"acc_norm\": 0.16748768472906403,\n \"acc_norm_stderr\": 0.026273086047535428\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.18652849740932642,\n \"acc_stderr\": 0.028112091210117447,\n \"acc_norm\": 0.18652849740932642,\n \"acc_norm_stderr\": 
0.028112091210117447\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.02093244577446318,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.02093244577446318\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000683,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000683\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.2018348623853211,\n \"acc_stderr\": 0.017208579357787565,\n \"acc_norm\": 0.2018348623853211,\n \"acc_norm_stderr\": 0.017208579357787565\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1712962962962963,\n \"acc_stderr\": 0.025695341643824688,\n \"acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.025695341643824688\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955917,\n \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955917\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.3094170403587444,\n \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841043,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841043\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n \"acc_stderr\": 0.015162024152278434,\n \"acc_norm\": 0.23499361430395913,\n \"acc_norm_stderr\": 0.015162024152278434\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341016,\n \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341016\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.20679012345679013,\n \"acc_stderr\": 0.02253500670594282,\n \"acc_norm\": 0.20679012345679013,\n \"acc_norm_stderr\": 0.02253500670594282\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177788,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177788\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.025206963154225395,\n \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.025206963154225395\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766365,\n \"mc2\": 0.49601663232017196,\n \"mc2_stderr\": 0.01564731250181349\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5185477505919495,\n \"acc_stderr\": 0.014042813708888378\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Locutusque/TinyMistral-248M-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|arc:challenge|25_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|gsm8k|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hellaswag|10_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T20-59-32.750418.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T20-59-32.750418.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T20-59-32.750418.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T20-59-32.750418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T20-59-32.750418.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["**/details_harness|winogrande|5_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-07T20-59-32.750418.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T20_59_32.750418", "path": ["results_2024-01-07T20-59-32.750418.parquet"]}, {"split": "latest", "path": ["results_2024-01-07T20-59-32.750418.parquet"]}]}]} | 2024-01-07T21:02:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2
Dataset automatically created during the evaluation run of model Locutusque/TinyMistral-248M-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
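A minimal loading sketch is given below. The repository name follows the leaderboard's usual `details_<org>__<model>` convention for this model, and `harness_winogrande_5` is one of the configurations listed in this card's metadata:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2",
	"harness_winogrande_5",
	split="train")
```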
## Latest results
These are the latest results from run 2024-01-07T20:59:32.750418 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/TinyMistral-248M-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T20:59:32.750418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/TinyMistral-248M-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T20:59:32.750418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/TinyMistral-248M-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T20:59:32.750418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
5930a89955f6b78b159af7653730fecb1b5e4ba0 | # Phi-2 Rejection Sampling
The Phi-2 Rejection Sampling dataset is an English-language dataset consisting of 10 prompts and responses generated by [Phi-2](https://huggingface.co/microsoft/phi-2) and graded by [OpenAssistant's reward model](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large-v2).
## Dataset Details
### Dataset Description
The Phi-2 Rejection Sampling dataset is a small (n = 10) English-language dataset. It was created to demonstrate a feedback pipeline in which Phi-2 interacts with the OpenAssistant reward model to discover "good" responses to given prompts. For more information about how this dataset is intended to be used and how it was created, please refer to the respective sections below.
- **Curated by:** Tanush Chopra
- **Language(s) (NLP):** English
- **License:** MIT License
## Uses
This dataset is intended to be used for fine-tuning and improving Phi-2's responses.
### Direct Use
As stated above, this is intended to be used to fine-tune Phi-2 and improve its responses for the use cases captured by the selected prompts.
### Out-of-Scope Use
While this could potentially be used for other LLMs, caution should be observed, as each LLM is different and has different strengths and weaknesses. Given that, if you were to use the dataset to finetune other LLMs, you may observe unintended behavior and negative impacts on the responses post-finetuning.
## Dataset Structure
There are only 2 fields to observe in this dataset: prompt and response
- prompt: string
- response: string
They are in a .tsv format, meaning you can expect them structured as such:
PROMPT\tRESPONSE \
{PROMPT_NUMBER_1}\t{RESPONSE_NUMBER_1} \
... \
{PROMPT_NUMBER_10}\t{RESPONSE_NUMBER_10}
You can find the dataset under dataset.tsv.
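As a quick illustration, here is a minimal sketch of reading the file with pandas, assuming the `PROMPT\tRESPONSE` header row shown above:

```python
import pandas as pd

# dataset.tsv is tab-separated; the first row is the PROMPT/RESPONSE header.
df = pd.read_csv("dataset.tsv", sep="\t")

for prompt, response in zip(df["PROMPT"], df["RESPONSE"]):
    print(f"Prompt: {prompt}\nResponse: {response}\n")
```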
## Dataset Creation
### Curation Rationale
The dataset was curated with the intent of compiling a set of prompt-response pairs that could then be used to fine-tune Phi-2, improving its responses to similar prompts and prompting styles.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
Prompts were curated by Tanush Chopra by reviewing prompts, styles, and domains used in benchmark datasets (e.g. MMLU, HellaSwag), papers on arXiv about prompting styles (e.g. Chain of Thought), and prompt repositories (e.g. FlowGPT, [repo](https://github.com/f/awesome-chatgpt-prompts)).\
Responses were generated and evaluated using Phi-2 and OpenAssistant's reward model.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
Prompts were curated with 3 main areas in mind:
- Domain Usage (Arithmetic, Role-Playing, Commonsense, Conversation, Task/Planning)
- Prompting Style (Chain Of Thought, ELI5)
- Alignment (Hallucination, Bias and Toxicity, Power-Seeking, Moderation)
Each of these areas was kept in mind when crafting each prompt, in order to ensure representative coverage of how Phi-2 or other similar LLMs may be used or misused. Do keep in mind that this list of prompts is nowhere near comprehensive; however, it is a good start.\
After the prompts were crafted, a hyperparameter search for the optimal temperature (according to the average reward value) was conducted over the prompts with Phi-2 and OpenAssistant's reward model. The optimal temperature was found to be 0.4.\
After this was completed, Phi-2 was prompted 8 times on each prompt to get 8 unique responses. Each of these responses was then "graded" by the reward model, and the "best" response (according to the reward model) was saved in the dataset you see for potential fine-tuning.
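For illustration, a minimal best-of-n sketch of this pipeline is shown below. It is not the exact script used to build the dataset: the `transformers` loading code, the 256-token generation cap, and the prompt formatting are assumptions; only the two checkpoints, the temperature of 0.4, and the 8 samples per prompt come from this card.

```python
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoModelForSequenceClassification,
    AutoTokenizer,
)

# Generator (Phi-2) and reward model named in this card.
gen_tok = AutoTokenizer.from_pretrained("microsoft/phi-2")
gen = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-2", torch_dtype=torch.float16, device_map="auto"
)
rm_name = "OpenAssistant/reward-model-deberta-v3-large-v2"
rm_tok = AutoTokenizer.from_pretrained(rm_name)
rm = AutoModelForSequenceClassification.from_pretrained(rm_name)

def best_of_n(prompt: str, n: int = 8, temperature: float = 0.4) -> str:
    """Sample n completions and keep the one the reward model scores highest."""
    inputs = gen_tok(prompt, return_tensors="pt").to(gen.device)
    out = gen.generate(
        **inputs,
        do_sample=True,
        temperature=temperature,
        max_new_tokens=256,  # assumed cap, not stated in the card
        num_return_sequences=n,
        pad_token_id=gen_tok.eos_token_id,
    )
    # Score only the completion, not the echoed prompt.
    completions = [
        gen_tok.decode(seq[inputs["input_ids"].shape[1]:], skip_special_tokens=True)
        for seq in out
    ]
    scores = []
    for completion in completions:
        rm_inputs = rm_tok(prompt, completion, return_tensors="pt", truncation=True)
        with torch.no_grad():
            scores.append(rm(**rm_inputs).logits[0].item())
    return completions[scores.index(max(scores))]
```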
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
Tanush Chopra
- Ethnicity: Indian
- Nationality: American
- Age: 18
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
By nature of this activity (for an interview), the number of prompts is limited to 10. While the prompts have been carefully selected, they are nowhere near comprehensive and, as such, are limited in the potential scope of prompts and responses they can affect. I will revisit this at a later time to make this dataset more comprehensive. \
In addition, it should be noted that because this pipeline uses a reward model instead of human feedback, the selected responses may not be what "should" be enforced. For instance, if a hallucination is not caught by the reward model, it could be reinforced in the model, which is not good, to say the least.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
If you wish to execute a similar feedback pipeline, I would suggest scoring each response with multiple diverse, domain-independent reward models instead of just one, in order to mitigate biases in the datasets used to train any single reward model. In addition, using a vector DB of facts would be helpful to ensure factual accuracy.
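A hedged sketch of what such multi-reward-model scoring could look like — the per-scorer z-normalization is an illustrative design choice, not something this dataset used:

```python
import numpy as np

def ensemble_rank(prompt: str, candidates: list[str], scorers) -> str:
    """Pick the candidate with the best average z-scored reward.

    `scorers` is a list of callables (prompt, response) -> float, each
    wrapping a different reward model; z-scoring per scorer keeps any one
    model's score scale from dominating the average.
    """
    scores = np.array([[score(prompt, c) for c in candidates] for score in scorers])
    z = (scores - scores.mean(axis=1, keepdims=True)) / (
        scores.std(axis=1, keepdims=True) + 1e-8
    )
    return candidates[int(z.mean(axis=0).argmax())]
```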
## Dataset Card Authors [optional]
Tanush Chopra
## Dataset Card Contact
[email protected] | BluefinTuna/phi2_rejection_sampling | [
"task_categories:question-answering",
"size_categories:n<1K",
"language:en",
"license:mit",
"rejection-sampling",
"phi-2",
"question-answering",
"region:us"
] | 2024-01-07T21:19:49+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["n<1K"], "task_categories": ["question-answering"], "pretty_name": "Rejection Sampling on Phi-2", "tags": ["rejection-sampling", "phi-2", "question-answering"]} | 2024-01-09T18:18:57+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-n<1K #language-English #license-mit #rejection-sampling #phi-2 #question-answering #region-us
| # Phi-2 Rejection Sampling
The Phi-2 Rejection Sampling dataset is an English-language dataset consisting of 10 prompts and responses generated by Phi-2 and graded by OpenAssistant's reward model.
## Dataset Details
### Dataset Description
The Phi-2 Rejection Sampling dataset is a small (n = 10) English-language dataset. It was created to demonstrate a feedback pipeline in which Phi-2 interacts with the OpenAssistant reward model to discover "good" responses to given prompts. For more information about how this dataset is intended to be used and how it was created, please refer to the respective sections below.
- Curated by: Tanush Chopra
- Language(s) (NLP): English
- License: MIT License
## Uses
This dataset is intended to be used for fine-tuning and improving Phi-2's responses.
### Direct Use
As stated above, this is intended to be used to fine-tune Phi-2 and improve its responses for the use cases captured by the selected prompts.
### Out-of-Scope Use
While this could potentially be used for other LLMs, caution should be observed, as each LLM is different and has different strengths and weaknesses. Given that, if you were to use the dataset to finetune other LLMs, you may observe unintended behavior and negative impacts on the responses post-finetuning.
## Dataset Structure
There are only 2 fields to observe in this dataset: prompt and response
- prompt: string
- response: string
They are in a .tsv format, meaning you can expect them structured as such:
PROMPT\tRESPONSE \
{PROMPT_NUMBER_1}\t{RESPONSE_NUMBER_1} \
... \
{PROMPT_NUMBER_10}\t{RESPONSE_NUMBER_10}
You can find the dataset under URL.
## Dataset Creation
### Curation Rationale
The dataset was curated with the intent of compiling a set of prompt-response pairs that could then be used to fine-tune Phi-2, improving its responses to similar prompts and prompting styles.
### Source Data
Prompts were curated by Tanush Chopra by reviewing prompts, styles, and domains used in benchmark datasets (e.g. MMLU, HellaSwag), papers on arXiv about prompting styles (e.g. Chain of Thought), and prompt repositories (e.g. FlowGPT, repo).\
Responses were generated and evaluated using Phi-2 and OpenAssistant's reward model.
#### Data Collection and Processing
Prompts were curated with 3 main areas in mind:
- Domain Usage (Arithmetic, Role-Playing, Commonsense, Conversation, Task/Planning)
- Prompting Style (Chain Of Thought, ELI5)
- Alignment (Hallucination, Bias and Toxicity, Power-Seeking, Moderation)
Each of these areas was kept in mind when crafting each prompt, in order to ensure representative coverage of how Phi-2 or other similar LLMs may be used or misused. Do keep in mind that this list of prompts is nowhere near comprehensive; however, it is a good start.\
After the prompts were crafted, a hyperparameter search for the optimal temperature (according to the average reward value) was conducted over the prompts with Phi-2 and OpenAssistant's reward model. The optimal temperature was found to be 0.4.\
After this was completed, Phi-2 was prompted 8 times on each prompt to get 8 unique responses. Each of these responses was then "graded" by the reward model, and the "best" response (according to the reward model) was saved in the dataset you see for potential fine-tuning.
#### Who are the source data producers?
Tanush Chopra
- Ethnicity: Indian
- Nationality: American
- Age: 18
## Bias, Risks, and Limitations
By nature of this activity (for an interview), the number of prompts is limited to 10. While the prompts have been carefully selected, they are nowhere near comprehensive and, as such, are limited in the potential scope of prompts and responses they can affect. I will revisit this at a later time to make this dataset more comprehensive. \
In addition, it should be noted that because this pipeline uses a reward model instead of human feedback, the selected responses may not be what "should" be enforced. For instance, if a hallucination is not caught by the reward model, it could be reinforced in the model, which is not good, to say the least.
### Recommendations
If you wish to execute a similar feedback pipeline, I would suggest scoring each response with multiple diverse, domain-independent reward models instead of just one, in order to mitigate biases in the datasets used to train any single reward model. In addition, using a vector DB of facts would be helpful to ensure factual accuracy.
## Dataset Card Authors [optional]
Tanush Chopra
## Dataset Card Contact
tanushchop@URL | [
"# Phi-2 Rejection Sampling\n\nThe Phi-2 Rejection Sampling dataset is an English-language dataset consisting of 10 prompts and responses generated by Phi-2 and graded by the OpenAssistant's reward model.",
"## Dataset Details",
"### Dataset Description\n\nThe Phi-2 Rejection Sampling dataset is a small (n = 10) English-language dataset. This dataset was created with the purpose was to demonstrate a feedback pipeline where in which Phi-2 would interact with the OpenAssistant reward model to discover \"good\" responses to given prompts. For more information about how this dataset is intended to be used and how it was created, please refer to their respective sections.\n\n- Curated by: Tanush Chopra\n- Language(s) (NLP): English\n- License: MIT License",
"## Uses\n\nThis is intended to be used to fine-tuning and improving Phi-2's responses.",
"### Direct Use\n\nAs stated above, this is intended to be used to fine-tune Phi-2 and improve its responses for the use cases captured by the selected prompts.",
"### Out-of-Scope Use\n\nWhile this could potentially be used for other LLMs, caution should be observed as each LLM is different and has different strengths and weaknesses. Given that, if you were to use the dataset to finetune other LLMs you may observe unintended behavior and negative impacts on the responses post-finetuning.",
"## Dataset Structure\n\nThere are only 2 fields to observe in this dataset: prompt and response\n- prompt: string\n- response: string\n\nThey are in a .tsv format, meaning you can expect them structured as such:\n\nPROMPT\\tRESPONSE \\\n{PROMPT_NUMBER_1}\\t{RESPONSE_NUMBER_1} \\\n... \\\n{PROMPT_NUMBER_10}\\t{RESPONSE_NUMBER_10}\n\nYou can find the dataset under URL.",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset was curated with the intent of compiling a set of prompt-response pairs that could then be used to fine-tune Phi-2, improving its responses to similar such prompts and styles of prompts.",
"### Source Data\n\n\n\nPrompts were curated by Tanush Chopra by looking over various prompts, styles, and domains used in datasets (e.g. MMLU, HellaSwag), various papers on ArXiv on prompting styles (e.g. Chain of Thought) and prompt repositories (e.g. FlowGPT, repo).\\\nResponses were generated and evaluated using Phi-2 and OpenAssistant's reward model.",
"#### Data Collection and Processing\n\n\n\nPrompts were curated with 3 main areas in mind:\n- Domain Usage (Arithmetic, Role-Playing, Commonsense, Conversation, Task/Planning)\n- Prompting Style (Chain Of Thought, ELI5)\n- Alignment (Hallucination, Bias and Toxicity, Power-Seeking, Moderation)\n\nEach of these areas were kept in mind with the crafting of each prompt in order to ensure a representative usage of how Phi-2 or other similar LLMs may be used or misused. Do keep in mind that this list of prompts is nowhere near comprehensive, however, it is a good start.\\\nAfter prompts were crafted, a hyperparameter search for the optimal temperature (according to the average reward value) was found for the prompts with Phi-2 and OpenAssistant's reward model. The optimal temperature was found to be 0.4.\\\nAfter this was completed, Phi-2 was then prompted 8 times on each prompt to get 8 unique responses. Each of these responses were then \"graded\" by the reward model and the \"best\" response (according to the reward model) was saved in the dataset you see for potential fine-tuning.",
"#### Who are the source data producers?\n\n\n\nTanush Chopra\n- Ethnicity: Indian\n- Nationality: American\n- Age: 18",
"## Bias, Risks, and Limitations\n\n\n\nBy nature of this activity (for an interview), the number of prompts is limited to 10. While the prompts have been carefully selected, they are nowhere near comprehensive and, as such, are limited with the potential scope of prompts and responses it can affect. I will revisit this at a later time to make this dataset more comprehensive. \\\nIn addition, it should be noted that due to this using a reward model instead of human feedback, the results of the responses may not be what \"should\" be enforced. For instance, if the model hallucinating is not caught by the reward model this could potentially be reinforced in the model, which is not good to say the least.",
"### Recommendations\n\n\n\nIf you wish to execute a similar feedback pipeline, I would suggest scoring each response with multiple diverse and domain-independent comprehensive reward models instead of just one in order to mitigate potential biases in datasets used for training the reward models. In addition, using a vector DB of facts would be helpful to ensure factual accuracy.",
"## Dataset Card Authors [optional]\n\nTanush Chopra",
"## Dataset Card Contact\n\ntanushchop@URL"
] | [
"TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #license-mit #rejection-sampling #phi-2 #question-answering #region-us \n",
"# Phi-2 Rejection Sampling\n\nThe Phi-2 Rejection Sampling dataset is an English-language dataset consisting of 10 prompts and responses generated by Phi-2 and graded by the OpenAssistant's reward model.",
"## Dataset Details",
"### Dataset Description\n\nThe Phi-2 Rejection Sampling dataset is a small (n = 10) English-language dataset. This dataset was created with the purpose was to demonstrate a feedback pipeline where in which Phi-2 would interact with the OpenAssistant reward model to discover \"good\" responses to given prompts. For more information about how this dataset is intended to be used and how it was created, please refer to their respective sections.\n\n- Curated by: Tanush Chopra\n- Language(s) (NLP): English\n- License: MIT License",
"## Uses\n\nThis is intended to be used to fine-tuning and improving Phi-2's responses.",
"### Direct Use\n\nAs stated above, this is intended to be used to fine-tune Phi-2 and improve its responses for the use cases captured by the selected prompts.",
"### Out-of-Scope Use\n\nWhile this could potentially be used for other LLMs, caution should be observed as each LLM is different and has different strengths and weaknesses. Given that, if you were to use the dataset to finetune other LLMs you may observe unintended behavior and negative impacts on the responses post-finetuning.",
"## Dataset Structure\n\nThere are only 2 fields to observe in this dataset: prompt and response\n- prompt: string\n- response: string\n\nThey are in a .tsv format, meaning you can expect them structured as such:\n\nPROMPT\\tRESPONSE \\\n{PROMPT_NUMBER_1}\\t{RESPONSE_NUMBER_1} \\\n... \\\n{PROMPT_NUMBER_10}\\t{RESPONSE_NUMBER_10}\n\nYou can find the dataset under URL.",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset was curated with the intent of compiling a set of prompt-response pairs that could then be used to fine-tune Phi-2, improving its responses to similar such prompts and styles of prompts.",
"### Source Data\n\n\n\nPrompts were curated by Tanush Chopra by looking over various prompts, styles, and domains used in datasets (e.g. MMLU, HellaSwag), various papers on ArXiv on prompting styles (e.g. Chain of Thought) and prompt repositories (e.g. FlowGPT, repo).\\\nResponses were generated and evaluated using Phi-2 and OpenAssistant's reward model.",
"#### Data Collection and Processing\n\n\n\nPrompts were curated with 3 main areas in mind:\n- Domain Usage (Arithmetic, Role-Playing, Commonsense, Conversation, Task/Planning)\n- Prompting Style (Chain Of Thought, ELI5)\n- Alignment (Hallucination, Bias and Toxicity, Power-Seeking, Moderation)\n\nEach of these areas were kept in mind with the crafting of each prompt in order to ensure a representative usage of how Phi-2 or other similar LLMs may be used or misused. Do keep in mind that this list of prompts is nowhere near comprehensive, however, it is a good start.\\\nAfter prompts were crafted, a hyperparameter search for the optimal temperature (according to the average reward value) was found for the prompts with Phi-2 and OpenAssistant's reward model. The optimal temperature was found to be 0.4.\\\nAfter this was completed, Phi-2 was then prompted 8 times on each prompt to get 8 unique responses. Each of these responses were then \"graded\" by the reward model and the \"best\" response (according to the reward model) was saved in the dataset you see for potential fine-tuning.",
"#### Who are the source data producers?\n\n\n\nTanush Chopra\n- Ethnicity: Indian\n- Nationality: American\n- Age: 18",
"## Bias, Risks, and Limitations\n\n\n\nBy nature of this activity (for an interview), the number of prompts is limited to 10. While the prompts have been carefully selected, they are nowhere near comprehensive and, as such, are limited with the potential scope of prompts and responses it can affect. I will revisit this at a later time to make this dataset more comprehensive. \\\nIn addition, it should be noted that due to this using a reward model instead of human feedback, the results of the responses may not be what \"should\" be enforced. For instance, if the model hallucinating is not caught by the reward model this could potentially be reinforced in the model, which is not good to say the least.",
"### Recommendations\n\n\n\nIf you wish to execute a similar feedback pipeline, I would suggest scoring each response with multiple diverse and domain-independent comprehensive reward models instead of just one in order to mitigate potential biases in datasets used for training the reward models. In addition, using a vector DB of facts would be helpful to ensure factual accuracy.",
"## Dataset Card Authors [optional]\n\nTanush Chopra",
"## Dataset Card Contact\n\ntanushchop@URL"
] | [
53,
55,
4,
124,
23,
38,
85,
116,
5,
56,
108,
277,
29,
162,
83,
14,
10
] | [
"passage: TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #license-mit #rejection-sampling #phi-2 #question-answering #region-us \n# Phi-2 Rejection Sampling\n\nThe Phi-2 Rejection Sampling dataset is an English-language dataset consisting of 10 prompts and responses generated by Phi-2 and graded by the OpenAssistant's reward model.## Dataset Details### Dataset Description\n\nThe Phi-2 Rejection Sampling dataset is a small (n = 10) English-language dataset. This dataset was created with the purpose was to demonstrate a feedback pipeline where in which Phi-2 would interact with the OpenAssistant reward model to discover \"good\" responses to given prompts. For more information about how this dataset is intended to be used and how it was created, please refer to their respective sections.\n\n- Curated by: Tanush Chopra\n- Language(s) (NLP): English\n- License: MIT License## Uses\n\nThis is intended to be used to fine-tuning and improving Phi-2's responses.### Direct Use\n\nAs stated above, this is intended to be used to fine-tune Phi-2 and improve its responses for the use cases captured by the selected prompts.### Out-of-Scope Use\n\nWhile this could potentially be used for other LLMs, caution should be observed as each LLM is different and has different strengths and weaknesses. Given that, if you were to use the dataset to finetune other LLMs you may observe unintended behavior and negative impacts on the responses post-finetuning.## Dataset Structure\n\nThere are only 2 fields to observe in this dataset: prompt and response\n- prompt: string\n- response: string\n\nThey are in a .tsv format, meaning you can expect them structured as such:\n\nPROMPT\\tRESPONSE \\\n{PROMPT_NUMBER_1}\\t{RESPONSE_NUMBER_1} \\\n... \\\n{PROMPT_NUMBER_10}\\t{RESPONSE_NUMBER_10}\n\nYou can find the dataset under URL.## Dataset Creation",
"passage: ### Curation Rationale\n\nThe dataset was curated with the intent of compiling a set of prompt-response pairs that could then be used to fine-tune Phi-2, improving its responses to similar such prompts and styles of prompts.### Source Data\n\n\n\nPrompts were curated by Tanush Chopra by looking over various prompts, styles, and domains used in datasets (e.g. MMLU, HellaSwag), various papers on ArXiv on prompting styles (e.g. Chain of Thought) and prompt repositories (e.g. FlowGPT, repo).\\\nResponses were generated and evaluated using Phi-2 and OpenAssistant's reward model.#### Data Collection and Processing\n\n\n\nPrompts were curated with 3 main areas in mind:\n- Domain Usage (Arithmetic, Role-Playing, Commonsense, Conversation, Task/Planning)\n- Prompting Style (Chain Of Thought, ELI5)\n- Alignment (Hallucination, Bias and Toxicity, Power-Seeking, Moderation)\n\nEach of these areas were kept in mind with the crafting of each prompt in order to ensure a representative usage of how Phi-2 or other similar LLMs may be used or misused. Do keep in mind that this list of prompts is nowhere near comprehensive, however, it is a good start.\\\nAfter prompts were crafted, a hyperparameter search for the optimal temperature (according to the average reward value) was found for the prompts with Phi-2 and OpenAssistant's reward model. The optimal temperature was found to be 0.4.\\\nAfter this was completed, Phi-2 was then prompted 8 times on each prompt to get 8 unique responses. Each of these responses were then \"graded\" by the reward model and the \"best\" response (according to the reward model) was saved in the dataset you see for potential fine-tuning.#### Who are the source data producers?\n\n\n\nTanush Chopra\n- Ethnicity: Indian\n- Nationality: American\n- Age: 18"
] |
c36b4e26b62671387052e14f887e910ea57bcf28 |
# Dataset Card for Evaluation run of xaviviro/FLOR-1.3B-xat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xaviviro/FLOR-1.3B-xat](https://huggingface.co/xaviviro/FLOR-1.3B-xat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xaviviro__FLOR-1.3B-xat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-07T21:25:58.106311](https://huggingface.co/datasets/open-llm-leaderboard/details_xaviviro__FLOR-1.3B-xat/blob/main/results_2024-01-07T21-25-58.106311.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2672510124783361,
"acc_stderr": 0.031212215858179283,
"acc_norm": 0.26904827694943856,
"acc_norm_stderr": 0.03200707616382038,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.44375566520951115,
"mc2_stderr": 0.014968548556287192
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132865,
"acc_norm": 0.26791808873720135,
"acc_norm_stderr": 0.01294203019513643
},
"harness|hellaswag|10": {
"acc": 0.3437562238597889,
"acc_stderr": 0.004739902411944556,
"acc_norm": 0.4162517426807409,
"acc_norm_stderr": 0.004919289113027516
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.03455473702325436,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03455473702325436
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.02783491252754409,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.02783491252754409
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686936,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686936
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.22127659574468084,
"acc_stderr": 0.027136349602424052,
"acc_norm": 0.22127659574468084,
"acc_norm_stderr": 0.027136349602424052
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764822,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.02702543349888236,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.02702543349888236
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3431192660550459,
"acc_stderr": 0.02035477773608604,
"acc_norm": 0.3431192660550459,
"acc_norm_stderr": 0.02035477773608604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03005820270430985,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03005820270430985
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21940928270042195,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.21940928270042195,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.026241132996407273,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.026241132996407273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.32061068702290074,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.32061068702290074,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.1901840490797546,
"acc_stderr": 0.030833491146281235,
"acc_norm": 0.1901840490797546,
"acc_norm_stderr": 0.030833491146281235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749465,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749465
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.015769984840690518,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.015769984840690518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3104575163398693,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.3104575163398693,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.025311765975426115,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.025311765975426115
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445803,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2516297262059974,
"acc_stderr": 0.011083276280441904,
"acc_norm": 0.2516297262059974,
"acc_norm_stderr": 0.011083276280441904
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.028888193103988644,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.028888193103988644
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132226,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.028920583220675596,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.028920583220675596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935555,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935555
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.44375566520951115,
"mc2_stderr": 0.014968548556287192
},
"harness|winogrande|5": {
"acc": 0.5343330702446725,
"acc_stderr": 0.01401931753154257
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.002389281512077213
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_xaviviro__FLOR-1.3B-xat | [
"region:us"
] | 2024-01-07T21:28:32+00:00 | {"pretty_name": "Evaluation run of xaviviro/FLOR-1.3B-xat", "dataset_summary": "Dataset automatically created during the evaluation run of model [xaviviro/FLOR-1.3B-xat](https://huggingface.co/xaviviro/FLOR-1.3B-xat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xaviviro__FLOR-1.3B-xat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T21:25:58.106311](https://huggingface.co/datasets/open-llm-leaderboard/details_xaviviro__FLOR-1.3B-xat/blob/main/results_2024-01-07T21-25-58.106311.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2672510124783361,\n \"acc_stderr\": 0.031212215858179283,\n \"acc_norm\": 0.26904827694943856,\n \"acc_norm_stderr\": 0.03200707616382038,\n \"mc1\": 0.27539779681762544,\n \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.44375566520951115,\n \"mc2_stderr\": 0.014968548556287192\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132865,\n \"acc_norm\": 0.26791808873720135,\n \"acc_norm_stderr\": 0.01294203019513643\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3437562238597889,\n \"acc_stderr\": 0.004739902411944556,\n \"acc_norm\": 0.4162517426807409,\n \"acc_norm_stderr\": 0.004919289113027516\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03455473702325436,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03455473702325436\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.03738520676119667,\n \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.03738520676119667\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.02783491252754409,\n \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.02783491252754409\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n \"acc_stderr\": 0.03773809990686936,\n \"acc_norm\": 0.2847222222222222,\n \"acc_norm_stderr\": 0.03773809990686936\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 
0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.22127659574468084,\n \"acc_stderr\": 0.027136349602424052,\n \"acc_norm\": 0.22127659574468084,\n \"acc_norm_stderr\": 0.027136349602424052\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764822,\n \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.02702543349888236,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.02702543349888236\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3431192660550459,\n \"acc_stderr\": 0.02035477773608604,\n \"acc_norm\": 0.3431192660550459,\n \"acc_norm_stderr\": 0.02035477773608604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03005820270430985,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03005820270430985\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.21940928270042195,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.21940928270042195,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n \"acc_stderr\": 0.026241132996407273,\n \"acc_norm\": 0.18834080717488788,\n \"acc_norm_stderr\": 0.026241132996407273\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.32061068702290074,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.32061068702290074,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.1901840490797546,\n \"acc_stderr\": 0.030833491146281235,\n \"acc_norm\": 0.1901840490797546,\n \"acc_norm_stderr\": 0.030833491146281235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.028911208802749465,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.028911208802749465\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.26436781609195403,\n \"acc_stderr\": 0.015769984840690518,\n \"acc_norm\": 0.26436781609195403,\n \"acc_norm_stderr\": 0.015769984840690518\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3104575163398693,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.3104575163398693,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n \"acc_stderr\": 0.025311765975426115,\n \"acc_norm\": 0.2733118971061093,\n \"acc_norm_stderr\": 0.025311765975426115\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445803,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445803\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2516297262059974,\n \"acc_stderr\": 0.011083276280441904,\n \"acc_norm\": 0.2516297262059974,\n \"acc_norm_stderr\": 0.011083276280441904\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.028888193103988644,\n \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.028888193103988644\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.028920583220675596,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.028920583220675596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n \"acc_stderr\": 0.03115715086935555,\n \"acc_norm\": 0.263681592039801,\n \"acc_norm_stderr\": 0.03115715086935555\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987249,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987249\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.44375566520951115,\n \"mc2_stderr\": 0.014968548556287192\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5343330702446725,\n \"acc_stderr\": 0.01401931753154257\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n 
\"acc_stderr\": 0.002389281512077213\n }\n}\n```", "repo_url": "https://huggingface.co/xaviviro/FLOR-1.3B-xat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|arc:challenge|25_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|gsm8k|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hellaswag|10_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-25-58.106311.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-25-58.106311.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-25-58.106311.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T21-25-58.106311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-25-58.106311.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T21_25_58.106311", "path": ["**/details_harness|winogrande|5_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-07T21-25-58.106311.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_07T21_25_58.106311", "path": ["results_2024-01-07T21-25-58.106311.parquet"]}, {"split": "latest", "path": ["results_2024-01-07T21-25-58.106311.parquet"]}]}]} | 2024-01-07T21:28:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of xaviviro/FLOR-1.3B-xat
Dataset automatically created during the evaluation run of model xaviviro/FLOR-1.3B-xat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
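```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xaviviro__FLOR-1.3B-xat",
	"harness_winogrande_5",
	split="train")
```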
## Latest results
These are the latest results from run 2024-01-07T21:25:58.106311 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
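An excerpt of that results file is shown below (aggregate scores and headline tasks only; standard errors and the 57 per-subject MMLU entries are omitted here for brevity — the full file is stored in this repository):

```python
{
    "all": {
        "acc": 0.2672510124783361,
        "acc_stderr": 0.031212215858179283,
        "acc_norm": 0.26904827694943856,
        "acc_norm_stderr": 0.03200707616382038,
        "mc1": 0.27539779681762544,
        "mc1_stderr": 0.01563813566777552,
        "mc2": 0.44375566520951115,
        "mc2_stderr": 0.014968548556287192
    },
    "harness|arc:challenge|25": {
        "acc": 0.22696245733788395,
        "acc_norm": 0.26791808873720135
    },
    "harness|hellaswag|10": {
        "acc": 0.3437562238597889,
        "acc_norm": 0.4162517426807409
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.27539779681762544,
        "mc2": 0.44375566520951115
    },
    "harness|winogrande|5": {
        "acc": 0.5343330702446725
    },
    "harness|gsm8k|5": {
        "acc": 0.0075815011372251705
    }
}
```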
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of xaviviro/FLOR-1.3B-xat\n\n\n\nDataset automatically created during the evaluation run of model xaviviro/FLOR-1.3B-xat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T21:25:58.106311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of xaviviro/FLOR-1.3B-xat\n\n\n\nDataset automatically created during the evaluation run of model xaviviro/FLOR-1.3B-xat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T21:25:58.106311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xaviviro/FLOR-1.3B-xat\n\n\n\nDataset automatically created during the evaluation run of model xaviviro/FLOR-1.3B-xat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T21:25:58.106311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
9f1451f3bde31038c199bc78dbdca4bd1b46c900 |
# Dataset Card for Evaluation run of jilp00/Hermes-2-SOLAR-10.7B-Symbolic
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jilp00/Hermes-2-SOLAR-10.7B-Symbolic](https://huggingface.co/jilp00/Hermes-2-SOLAR-10.7B-Symbolic) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic",
"harness_winogrande_5",
split="train")
```
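Assuming this repository follows the same layout as the other leaderboard detail repos (a "results" config whose "latest" split always points at the newest run), the aggregated metrics can also be loaded directly rather than task by task:

```python
from datasets import load_dataset

# "results" is the aggregated-results config of this detail repo; the
# "latest" split is an alias for the most recent evaluation timestamp.
results = load_dataset("open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic",
	"results",
	split="latest")
```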
## Latest results
These are the [latest results from run 2024-01-07T21:33:05.098650](https://huggingface.co/datasets/open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic/blob/main/results_2024-01-07T21-33-05.098650.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6426233455922059,
"acc_stderr": 0.03178345115211833,
"acc_norm": 0.6530283252823872,
"acc_norm_stderr": 0.0324878445417635,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.01676379072844634,
"mc2": 0.5484982867197952,
"mc2_stderr": 0.015225517770683289
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230914,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672874
},
"harness|hellaswag|10": {
"acc": 0.6030671181039634,
"acc_stderr": 0.004882619484166602,
"acc_norm": 0.8257319259111731,
"acc_norm_stderr": 0.0037856457412359383
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388542,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388542
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215282,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215282
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4748603351955307,
"acc_stderr": 0.016701350842682632,
"acc_norm": 0.4748603351955307,
"acc_norm_stderr": 0.016701350842682632
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179622,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700855,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700855
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48239895697522817,
"acc_stderr": 0.012762321298823643,
"acc_norm": 0.48239895697522817,
"acc_norm_stderr": 0.012762321298823643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700032,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306042,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.01676379072844634,
"mc2": 0.5484982867197952,
"mc2_stderr": 0.015225517770683289
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491906
},
"harness|gsm8k|5": {
"acc": 0.13949962092494314,
"acc_stderr": 0.009543426687191282
}
}
```
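For quick post-hoc analysis, the raw scores above can be consumed directly as JSON. The sketch below is illustrative only: it assumes the results block has been saved locally as `results.json` (the file name is an assumption, not part of the dataset) and computes an unweighted mean over the MMLU (`hendrycksTest`) subtasks plus an approximate 95% confidence interval for one task from its reported standard error.

```python
import json

# Assumption: the results JSON above was saved locally as "results.json".
with open("results.json") as f:
    results = json.load(f)

# Unweighted mean accuracy over the MMLU ("hendrycksTest") subtasks.
# Note: the leaderboard's headline MMLU number may weight tasks differently.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU mean acc over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")

# Approximate 95% confidence interval for Winogrande from the reported stderr.
wg = results["harness|winogrande|5"]
lo, hi = wg["acc"] - 1.96 * wg["acc_stderr"], wg["acc"] + 1.96 * wg["acc_stderr"]
print(f"Winogrande acc: {wg['acc']:.4f} (95% CI ~ [{lo:.4f}, {hi:.4f}])")
```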
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
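As a hedged example of direct use, the per-task details can be loaded for error analysis of a single benchmark. The config name below (`harness_gsm8k_5`) is taken from this dataset's own config list; the `latest` split is an alias for the most recent run (here, the 2024-01-07 timestamped split).

```python
from datasets import load_dataset

# Load the per-example GSM8K details for this model; "latest" aliases the
# most recent evaluation run's timestamped split.
details = load_dataset(
    "open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic",
    "harness_gsm8k_5",
    split="latest",
)
print(details)     # dataset size and column names
print(details[0])  # inspect a single evaluated example
```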
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
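Pending a full description, the structure can be inspected programmatically. The sketch below (assuming the `datasets` library) lists the per-task configurations and the splits of one of them; each config carries one timestamped split per run plus a `latest` alias.

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_jilp00__Hermes-2-SOLAR-10.7B-Symbolic"

# One config per evaluated task, plus an aggregated "results" config.
configs = get_dataset_config_names(repo)
print(f"{len(configs)} configs, e.g. {configs[:3]}")

# Each config has one timestamped split per run and a "latest" alias.
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```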
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed before further recommendations can be made.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
] |
dca4d5859802f411653900634c9af21d7a50ccd1 |
# Dataset Card for Evaluation run of bhavinjawade/SOLAR-10B-OrcaDPO-Jawade
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bhavinjawade/SOLAR-10B-OrcaDPO-Jawade](https://huggingface.co/bhavinjawade/SOLAR-10B-OrcaDPO-Jawade) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-OrcaDPO-Jawade",
"harness_winogrande_5",
split="train")
```
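The per-task configurations and timestamped run splits can also be enumerated programmatically; a small sketch (the config and split names below are taken from this repository's configuration list):
```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-OrcaDPO-Jawade"

# One configuration per evaluated task, e.g. "harness_arc_challenge_25"
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:3])

# Each run is a split named after its timestamp; "latest" tracks the newest run
details = load_dataset(REPO, "harness_gsm8k_5",
                       split="2024_01_08T00_26_00.493175")
```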
## Latest results
These are the [latest results from run 2024-01-08T00:26:00.493175](https://huggingface.co/datasets/open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-OrcaDPO-Jawade/blob/main/results_2024-01-08T00-26-00.493175.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6651208449600176,
"acc_stderr": 0.031664526015958,
"acc_norm": 0.6658468413864356,
"acc_norm_stderr": 0.03231099254455236,
"mc1": 0.5532435740514076,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.7156900791514688,
"mc2_stderr": 0.01507120145651986
},
"harness|arc:challenge|25": {
"acc": 0.6851535836177475,
"acc_stderr": 0.01357265770308495,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428175
},
"harness|hellaswag|10": {
"acc": 0.7117108145787692,
"acc_stderr": 0.004520406331084042,
"acc_norm": 0.882692690699064,
"acc_norm_stderr": 0.0032112847607016527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361072,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361072
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092437,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643526,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643526
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553332,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553332
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597524,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597524
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.016376966142610076,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.016376966142610076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087866,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023132376234543332,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023132376234543332
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4954367666232073,
"acc_stderr": 0.012769704263117522,
"acc_norm": 0.4954367666232073,
"acc_norm_stderr": 0.012769704263117522
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.01866335967146366,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.01866335967146366
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5532435740514076,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.7156900791514688,
"mc2_stderr": 0.01507120145651986
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
},
"harness|gsm8k|5": {
"acc": 0.6482183472327521,
"acc_stderr": 0.013153446023536042
}
}
```
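The headline benchmark numbers can be recomputed from the dictionary above; a minimal sketch, assuming the JSON has been parsed into a Python dict named `results` (all key names match the output shown):
```python
def summarize(results: dict) -> dict:
    """Aggregate the per-task scores above into per-benchmark headline metrics."""
    # MMLU is reported per subject; average the 57 "hendrycksTest" accuracies
    mmlu = [v["acc"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
    return {
        "arc_challenge_acc_norm": results["harness|arc:challenge|25"]["acc_norm"],
        "hellaswag_acc_norm": results["harness|hellaswag|10"]["acc_norm"],
        "mmlu_mean_acc": sum(mmlu) / len(mmlu),
        "truthfulqa_mc2": results["harness|truthfulqa:mc|0"]["mc2"],
        "winogrande_acc": results["harness|winogrande|5"]["acc"],
        "gsm8k_acc": results["harness|gsm8k|5"]["acc"],
    }
```
The Open LLM Leaderboard's overall score is, roughly, the unweighted mean of these six per-benchmark numbers.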
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-OrcaDPO-Jawade | [
"region:us"
] | 2024-01-07T21:37:05+00:00 | {"pretty_name": "Evaluation run of bhavinjawade/SOLAR-10B-OrcaDPO-Jawade", "dataset_summary": "Dataset automatically created during the evaluation run of model [bhavinjawade/SOLAR-10B-OrcaDPO-Jawade](https://huggingface.co/bhavinjawade/SOLAR-10B-OrcaDPO-Jawade) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-OrcaDPO-Jawade\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T00:26:00.493175](https://huggingface.co/datasets/open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-OrcaDPO-Jawade/blob/main/results_2024-01-08T00-26-00.493175.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6651208449600176,\n \"acc_stderr\": 0.031664526015958,\n \"acc_norm\": 0.6658468413864356,\n \"acc_norm_stderr\": 0.03231099254455236,\n \"mc1\": 0.5532435740514076,\n \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.7156900791514688,\n \"mc2_stderr\": 0.01507120145651986\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.01357265770308495,\n \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428175\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7117108145787692,\n \"acc_stderr\": 0.004520406331084042,\n \"acc_norm\": 0.882692690699064,\n \"acc_norm_stderr\": 0.0032112847607016527\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361072,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361072\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236785,\n \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236785\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335065,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335065\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437217,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437217\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092437,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092437\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643526,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643526\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553332,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553332\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n \"acc_stderr\": 0.016376966142610076,\n \"acc_norm\": 0.39888268156424583,\n \"acc_norm_stderr\": 0.016376966142610076\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087866,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087866\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023132376234543332,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023132376234543332\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303055,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303055\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4954367666232073,\n \"acc_stderr\": 0.012769704263117522,\n \"acc_norm\": 0.4954367666232073,\n \"acc_norm_stderr\": 0.012769704263117522\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.01866335967146366,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.01866335967146366\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5532435740514076,\n \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.7156900791514688,\n \"mc2_stderr\": 0.01507120145651986\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6482183472327521,\n \"acc_stderr\": 
0.013153446023536042\n }\n}\n```", "repo_url": "https://huggingface.co/bhavinjawade/SOLAR-10B-OrcaDPO-Jawade", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|arc:challenge|25_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|arc:challenge|25_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|gsm8k|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|gsm8k|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hellaswag|10_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hellaswag|10_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-34-48.093271.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T21-34-48.093271.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-26-00.493175.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-26-00.493175.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-26-00.493175.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T00-26-00.493175.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-26-00.493175.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": 
"2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-34-48.093271.parquet"]}, 
{"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["**/details_harness|winogrande|5_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": ["**/details_harness|winogrande|5_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T00-26-00.493175.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T21_34_48.093271", "path": ["results_2024-01-07T21-34-48.093271.parquet"]}, {"split": "2024_01_08T00_26_00.493175", "path": 
["results_2024-01-08T00-26-00.493175.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T00-26-00.493175.parquet"]}]}]} | 2024-01-08T00:28:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bhavinjawade/SOLAR-10B-OrcaDPO-Jawade
Dataset automatically created during the evaluation run of model bhavinjawade/SOLAR-10B-OrcaDPO-Jawade on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
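A minimal loading sketch (reconstructed from the loader snippet these auto-generated cards normally carry; the repository id is assumed from the leaderboard's `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the 63 task configs listed in this repo's metadata):

```python
from datasets import load_dataset

# Pass any of the 63 task config names as the second argument.
data = load_dataset("open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-OrcaDPO-Jawade",
	"harness_winogrande_5",
	split="train")
```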
## Latest results
These are the latest results from run 2024-01-08T00:26:00.493175 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of bhavinjawade/SOLAR-10B-OrcaDPO-Jawade\n\n\n\nDataset automatically created during the evaluation run of model bhavinjawade/SOLAR-10B-OrcaDPO-Jawade on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T00:26:00.493175(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bhavinjawade/SOLAR-10B-OrcaDPO-Jawade\n\n\n\nDataset automatically created during the evaluation run of model bhavinjawade/SOLAR-10B-OrcaDPO-Jawade on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T00:26:00.493175(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bhavinjawade/SOLAR-10B-OrcaDPO-Jawade\n\n\n\nDataset automatically created during the evaluation run of model bhavinjawade/SOLAR-10B-OrcaDPO-Jawade on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T00:26:00.493175(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
ed232776ce5d736028c1da18e81ab7048969f369 |
This dataset is derived from the [GermanQuAD](https://www.deepset.ai/germanquad) dataset.
It takes the test set and represents it as qrels in the [BEIR](https://github.com/beir-cellar/beir) information retrieval benchmark format.
Corpus and query ids have been added.
The corresponding corpus can be found [here](https://huggingface.co/datasets/mteb/germanquad-retrieval).
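A minimal usage sketch (assuming the standard Hugging Face `datasets` loader; the `test` split name is taken from this repo's config, and the columns follow the usual BEIR query-id / corpus-id / score qrels layout):

```python
from datasets import load_dataset

# Load the relevance judgements (qrels) for the GermanQuAD test split.
qrels = load_dataset("mteb/germanquad-retrieval-qrels", split="test")
print(qrels[0])  # expected: a query-id / corpus-id / score triple
```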
Full credit for the original dataset goes to the [authors](https://arxiv.org/abs/2104.12741) of the GermanQuAD [dataset](https://huggingface.co/datasets/deepset/germandpr).
The original dataset is licensed under [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/).
Citation for the original dataset:
```
@misc{möller2021germanquad,
title={GermanQuAD and GermanDPR: Improving Non-English Question Answering and Passage Retrieval},
author={Timo Möller and Julian Risch and Malte Pietsch},
year={2021},
eprint={2104.12741},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
The derived dataset was created by [rasdani](https://huggingface.co/rasdani).
| mteb/germanquad-retrieval-qrels | [
"source_datasets:deepset/germanquad",
"language:de",
"license:cc-by-4.0",
"arxiv:2104.12741",
"region:us"
] | 2024-01-07T21:41:13+00:00 | {"language": ["de"], "license": "cc-by-4.0", "source_datasets": ["deepset/germanquad"], "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "test/data-00000-of-00001.arrow"}]}]} | 2024-01-08T17:49:49+00:00 | [
"2104.12741"
] | [
"de"
] | TAGS
#source_datasets-deepset/germanquad #language-German #license-cc-by-4.0 #arxiv-2104.12741 #region-us
|
This dataset is derived from the GermanQuAD dataset.
It takes the test set and represents it as qrels in the BEIR information retrieval benchmark format.
Corpus and query ids have been added.
The corresponding corpus can be found here.
Full credit for the original dataset goes to the authors of the GermanQuAD dataset.
The original dataset is licensed under CC BY-SA 4.0.
Citation for the original dataset:
The derived dataset was created by rasdani.
| [] | [
"TAGS\n#source_datasets-deepset/germanquad #language-German #license-cc-by-4.0 #arxiv-2104.12741 #region-us \n"
] | [
40
] | [
"passage: TAGS\n#source_datasets-deepset/germanquad #language-German #license-cc-by-4.0 #arxiv-2104.12741 #region-us \n"
] |
609d6f7df39214d837bd5987e22db2a9bffe2b00 | model: https://huggingface.co/sentence-transformers/clip-ViT-B-32 # 605MB | teamtom/25000_word_emb_base | [
"license:apache-2.0",
"region:us"
] | 2024-01-07T21:49:21+00:00 | {"license": "apache-2.0"} | 2024-01-14T16:03:06+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| model: URL # 605MB | [
"# 605MB"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# 605MB"
] | [
14,
4
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n# 605MB"
] |
6de74ed8995876e31f7ed363d148af74e881f091 |
# Dataset Card for Evaluation run of AtAndDev/CapybaraMarcoroni-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AtAndDev/CapybaraMarcoroni-7B](https://huggingface.co/AtAndDev/CapybaraMarcoroni-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Each of the 63 task configs can be loaded by name; "train" points to the latest run.
data = load_dataset("open-llm-leaderboard/details_AtAndDev__CapybaraMarcoroni-7B",
	"harness_winogrande_5",
	split="train")
```
## Latest results
These are the [latest results from run 2024-01-07T21:50:49.600700](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__CapybaraMarcoroni-7B/blob/main/results_2024-01-07T21-50-49.600700.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6545153984213022,
"acc_stderr": 0.03210949302385312,
"acc_norm": 0.6553202719640217,
"acc_norm_stderr": 0.03276998196560543,
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431455,
"mc2": 0.5706929434240026,
"mc2_stderr": 0.015037653624275078
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.01423587248790987,
"acc_norm": 0.6501706484641638,
"acc_norm_stderr": 0.013936809212158292
},
"harness|hellaswag|10": {
"acc": 0.6489743079067914,
"acc_stderr": 0.004763155068744876,
"acc_norm": 0.8481378211511651,
"acc_norm_stderr": 0.0035815378475817913
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493878,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493878
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.02340092891831049,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.02340092891831049
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.02967090612463088,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.02967090612463088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374296,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374296
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040696,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040696
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.015788007190185884,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.015788007190185884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.01273239828619044,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.01273239828619044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031215,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031215
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528183,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528183
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431455,
"mc2": 0.5706929434240026,
"mc2_stderr": 0.015037653624275078
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.01099517231801981
},
"harness|gsm8k|5": {
"acc": 0.6868840030326004,
"acc_stderr": 0.012774285669385085
}
}
```
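The aggregated numbers above can also be queried programmatically: a small sketch, assuming the "results" config and the "latest" split listed in this repo's metadata, follows.

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split tracks the newest run.
results = load_dataset("open-llm-leaderboard/details_AtAndDev__CapybaraMarcoroni-7B",
	"results",
	split="latest")
```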
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AtAndDev__CapybaraMarcoroni-7B | [
"region:us"
] | 2024-01-07T21:53:09+00:00 | {"pretty_name": "Evaluation run of AtAndDev/CapybaraMarcoroni-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [AtAndDev/CapybaraMarcoroni-7B](https://huggingface.co/AtAndDev/CapybaraMarcoroni-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AtAndDev__CapybaraMarcoroni-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T21:50:49.600700](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__CapybaraMarcoroni-7B/blob/main/results_2024-01-07T21-50-49.600700.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545153984213022,\n \"acc_stderr\": 0.03210949302385312,\n \"acc_norm\": 0.6553202719640217,\n \"acc_norm_stderr\": 0.03276998196560543,\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431455,\n \"mc2\": 0.5706929434240026,\n \"mc2_stderr\": 0.015037653624275078\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.01423587248790987,\n \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.013936809212158292\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6489743079067914,\n \"acc_stderr\": 0.004763155068744876,\n \"acc_norm\": 0.8481378211511651,\n \"acc_norm_stderr\": 0.0035815378475817913\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493878,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493878\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 
0.021995311963644237\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.02340092891831049,\n \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.02340092891831049\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.02967090612463088,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.02967090612463088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374296,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374296\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.02126271940040696,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.02126271940040696\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n \"acc_stderr\": 0.015788007190185884,\n \"acc_norm\": 0.33519553072625696,\n \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031215,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031215\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528183,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528183\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431455,\n \"mc2\": 0.5706929434240026,\n \"mc2_stderr\": 0.015037653624275078\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.01099517231801981\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6868840030326004,\n \"acc_stderr\": 0.012774285669385085\n }\n}\n```", "repo_url": "https://huggingface.co/AtAndDev/CapybaraMarcoroni-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|arc:challenge|25_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|gsm8k|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hellaswag|10_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-50-49.600700.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-50-49.600700.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-50-49.600700.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T21-50-49.600700.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-50-49.600700.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["**/details_harness|winogrande|5_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-07T21-50-49.600700.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T21_50_49.600700", "path": ["results_2024-01-07T21-50-49.600700.parquet"]}, {"split": "latest", "path": ["results_2024-01-07T21-50-49.600700.parquet"]}]}]} | 2024-01-07T21:53:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AtAndDev/CapybaraMarcoroni-7B
Dataset automatically created during the evaluation run of model AtAndDev/CapybaraMarcoroni-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
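The loading snippet was dropped from this copy of the card; a minimal reconstruction is sketched below. The dataset id follows the leaderboard's `open-llm-leaderboard/details_<org>__<model>` naming pattern and the config name comes from the metadata above, so treat both as assumptions rather than verified paths.

```python
from datasets import load_dataset

# Assumed dataset id, following the leaderboard naming convention;
# "harness_winogrande_5" is one of the 63 per-task configurations.
data = load_dataset(
    "open-llm-leaderboard/details_AtAndDev__CapybaraMarcoroni-7B",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest run
)
```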
## Latest results
These are the latest results from run 2024-01-07T21:50:49.600700 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AtAndDev/CapybaraMarcoroni-7B\n\n\n\nDataset automatically created during the evaluation run of model AtAndDev/CapybaraMarcoroni-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T21:50:49.600700(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AtAndDev/CapybaraMarcoroni-7B\n\n\n\nDataset automatically created during the evaluation run of model AtAndDev/CapybaraMarcoroni-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T21:50:49.600700(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AtAndDev/CapybaraMarcoroni-7B\n\n\n\nDataset automatically created during the evaluation run of model AtAndDev/CapybaraMarcoroni-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T21:50:49.600700(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
293697c7b232d27b800e5800026fc984dabae5cf | Function Calling Dataset Based on UltraChat Format | isaiahbjork/function-calling | [
"region:us"
] | 2024-01-07T21:56:12+00:00 | {} | 2024-01-07T23:30:47+00:00 | [] | [] | TAGS
#region-us
| Function Calling Dataset Based on UltraChat Format | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
ef1ac5f1b2cf3e4e905f2fe25d4b418c766390c9 |
# Dataset Card for Evaluation run of HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca](https://huggingface.co/HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-example details for one task: the second argument picks
# the configuration (here the 5-shot Winogrande eval), and the "train"
# split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca",
	"harness_winogrande_5",
	split="train")
```
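Since there are 63 task configurations (plus the aggregated "results" one), it can help to enumerate them programmatically before picking one. A small sketch, assuming only that the `datasets` library can reach the Hub:

```python
from datasets import get_dataset_config_names

# Fetch the list of available configurations from the Hub metadata,
# without downloading the underlying parquet files.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca"
)
print(len(configs))
print([c for c in configs if "winogrande" in c])  # ['harness_winogrande_5']
```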
## Latest results
These are the [latest results from run 2024-01-08T03:36:26.320528](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca/blob/main/results_2024-01-08T03-36-26.320528.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6236571632946254,
"acc_stderr": 0.03222013618820489,
"acc_norm": 0.6309524776297984,
"acc_norm_stderr": 0.03288521739348617,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.41422968964840373,
"mc2_stderr": 0.014212709995879808
},
"harness|arc:challenge|25": {
"acc": 0.5025597269624573,
"acc_stderr": 0.014611199329843784,
"acc_norm": 0.5315699658703071,
"acc_norm_stderr": 0.014582236460866975
},
"harness|hellaswag|10": {
"acc": 0.5616411073491336,
"acc_stderr": 0.004951717622007979,
"acc_norm": 0.7530372435769767,
"acc_norm_stderr": 0.0043036354511158045
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.025620857042936655,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.025620857042936655
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02655220782821529,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02655220782821529
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.02466674491518722,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.02466674491518722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.01612927102509985,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.01612927102509985
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899094,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899094
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.015961036675230966,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.015961036675230966
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424434,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424434
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.012767793787729338,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.012767793787729338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.41422968964840373,
"mc2_stderr": 0.014212709995879808
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.01210836530743752
},
"harness|gsm8k|5": {
"acc": 0.2835481425322214,
"acc_stderr": 0.012415070917508127
}
}
```
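The results block above is valid JSON, so it can be consumed directly without re-downloading anything. A minimal sketch, assuming `raw` holds the JSON text copied from above (the variable name is illustrative):

```python
import json

scores = json.loads(raw)  # `raw` = the JSON results block as a string

# Headline aggregate and a single-task score, using keys shown above.
print(scores["all"]["acc"])              # 0.6236...
print(scores["harness|gsm8k|5"]["acc"])  # 0.2835...

# Flatten per-task accuracy for further analysis (skips entries that
# report no "acc" field).
per_task = {k: v["acc"] for k, v in scores.items() if "acc" in v}
```

Note that "harness|truthfulqa:mc|0" reports `mc1`/`mc2` rather than `acc`, which is why the comprehension filters on the `"acc"` key.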
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
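The configuration metadata at the bottom of this card lists one configuration per evaluated task (for example `harness_winogrande_5`) plus an aggregated `results` configuration, each carrying one split per run timestamp and a `latest` alias. A minimal sketch for enumerating and loading them, assuming the `datasets` library is installed:

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:5])

# Each configuration exposes per-run timestamp splits and a "latest" alias.
winogrande = load_dataset(REPO, "harness_winogrande_5", split="latest")
print(winogrande)
```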
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca | [
"region:us"
] | 2024-01-07T22:01:26+00:00 | {"pretty_name": "Evaluation run of HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca](https://huggingface.co/HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T03:36:26.320528](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca/blob/main/results_2024-01-08T03-36-26.320528.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6236571632946254,\n \"acc_stderr\": 0.03222013618820489,\n \"acc_norm\": 0.6309524776297984,\n \"acc_norm_stderr\": 0.03288521739348617,\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.41422968964840373,\n \"mc2_stderr\": 0.014212709995879808\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5025597269624573,\n \"acc_stderr\": 0.014611199329843784,\n \"acc_norm\": 0.5315699658703071,\n \"acc_norm_stderr\": 0.014582236460866975\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5616411073491336,\n \"acc_stderr\": 0.004951717622007979,\n \"acc_norm\": 0.7530372435769767,\n \"acc_norm_stderr\": 0.0043036354511158045\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 
0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4497354497354497,\n \"acc_stderr\": 0.025620857042936655,\n \"acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.025620857042936655\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02655220782821529,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02655220782821529\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 
0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518722,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277726,\n \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277726\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n \"acc_stderr\": 0.01612927102509985,\n \"acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.01612927102509985\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899094,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899094\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n 
},\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n \"acc_stderr\": 0.015961036675230966,\n \"acc_norm\": 0.35083798882681566,\n \"acc_norm_stderr\": 0.015961036675230966\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424434,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424434\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n \"acc_stderr\": 0.012767793787729338,\n \"acc_norm\": 0.49022164276401564,\n \"acc_norm_stderr\": 0.012767793787729338\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.41422968964840373,\n \"mc2_stderr\": 0.014212709995879808\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.01210836530743752\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.2835481425322214,\n \"acc_stderr\": 0.012415070917508127\n }\n}\n```", "repo_url": "https://huggingface.co/HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|arc:challenge|25_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|arc:challenge|25_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|gsm8k|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|gsm8k|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hellaswag|10_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hellaswag|10_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-59-12.253105.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T21-59-12.253105.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-36-26.320528.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-36-26.320528.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-36-26.320528.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T03-36-26.320528.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-36-26.320528.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": 
"2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-59-12.253105.parquet"]}, 
{"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["**/details_harness|winogrande|5_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": ["**/details_harness|winogrande|5_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T03-36-26.320528.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T21_59_12.253105", "path": ["results_2024-01-07T21-59-12.253105.parquet"]}, {"split": "2024_01_08T03_36_26.320528", "path": 
["results_2024-01-08T03-36-26.320528.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T03-36-26.320528.parquet"]}]}]} | 2024-01-08T03:38:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca
Dataset automatically created during the evaluation run of model HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
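A minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming and exposes the timestamped and "latest" splits listed in this card's configurations:

```python
from datasets import load_dataset

# Per-sample details for one task; "latest" resolves to the newest eval run.
data = load_dataset("open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca",
	"harness_winogrande_5",
	split="latest")
```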
## Latest results
These are the latest results from run 2024-01-08T03:36:26.320528 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca\n\n\n\nDataset automatically created during the evaluation run of model HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T03:36:26.320528(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca\n\n\n\nDataset automatically created during the evaluation run of model HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T03:36:26.320528(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca\n\n\n\nDataset automatically created during the evaluation run of model HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T03:36:26.320528(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
9e7d482a7acfeb51db0d0a73b9cb88f23ae65c9a |
# Dataset Card for Evaluation run of mlabonne/NeuralMarcoro14-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/NeuralMarcoro14-7B](https://huggingface.co/mlabonne/NeuralMarcoro14-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralMarcoro14-7B",
"harness_winogrande_5",
split="train")
```
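The same API also works for the aggregated scores. A minimal sketch, assuming the "results" configuration exposes the same "latest" split alias as the per-task configurations listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the whole run; "latest" resolves to the newest eval.
results = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralMarcoro14-7B",
	"results",
	split="latest")
```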
## Latest results
These are the [latest results from run 2024-01-07T22:13:05.321347](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralMarcoro14-7B/blob/main/results_2024-01-07T22-13-05.321347.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6535384746419964,
"acc_stderr": 0.03202126688353153,
"acc_norm": 0.6533329638161822,
"acc_norm_stderr": 0.032681130488138685,
"mc1": 0.5091799265605875,
"mc1_stderr": 0.01750055072481976,
"mc2": 0.6564217699864116,
"mc2_stderr": 0.015360219658423699
},
"harness|arc:challenge|25": {
"acc": 0.689419795221843,
"acc_stderr": 0.013522292098053059,
"acc_norm": 0.7141638225255973,
"acc_norm_stderr": 0.013203196088537372
},
"harness|hellaswag|10": {
"acc": 0.7003584943238399,
"acc_stderr": 0.004571647137441118,
"acc_norm": 0.8759211312487553,
"acc_norm_stderr": 0.0032899775233939097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476073,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476073
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.013182222616720883,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.013182222616720883
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.016568971233548606,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.016568971233548606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781873,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658537,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658537
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.01273239828619044,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.01273239828619044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5091799265605875,
"mc1_stderr": 0.01750055072481976,
"mc2": 0.6564217699864116,
"mc2_stderr": 0.015360219658423699
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.7073540561031084,
"acc_stderr": 0.01253233436824289
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mlabonne__NeuralMarcoro14-7B | [
"region:us"
] | 2024-01-07T22:15:22+00:00 | {"pretty_name": "Evaluation run of mlabonne/NeuralMarcoro14-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [mlabonne/NeuralMarcoro14-7B](https://huggingface.co/mlabonne/NeuralMarcoro14-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralMarcoro14-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T22:13:05.321347](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralMarcoro14-7B/blob/main/results_2024-01-07T22-13-05.321347.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535384746419964,\n \"acc_stderr\": 0.03202126688353153,\n \"acc_norm\": 0.6533329638161822,\n \"acc_norm_stderr\": 0.032681130488138685,\n \"mc1\": 0.5091799265605875,\n \"mc1_stderr\": 0.01750055072481976,\n \"mc2\": 0.6564217699864116,\n \"mc2_stderr\": 0.015360219658423699\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.689419795221843,\n \"acc_stderr\": 0.013522292098053059,\n \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537372\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7003584943238399,\n \"acc_stderr\": 0.004571647137441118,\n \"acc_norm\": 0.8759211312487553,\n \"acc_norm_stderr\": 0.0032899775233939097\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476073,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476073\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n \"acc_stderr\": 0.013182222616720883,\n \"acc_norm\": 0.8378033205619413,\n 
\"acc_norm_stderr\": 0.013182222616720883\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n \"acc_stderr\": 0.016568971233548606,\n \"acc_norm\": 0.4324022346368715,\n \"acc_norm_stderr\": 0.016568971233548606\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781873,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658537,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658537\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5091799265605875,\n \"mc1_stderr\": 0.01750055072481976,\n \"mc2\": 0.6564217699864116,\n \"mc2_stderr\": 0.015360219658423699\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7073540561031084,\n \"acc_stderr\": 0.01253233436824289\n }\n}\n```", "repo_url": "https://huggingface.co/mlabonne/NeuralMarcoro14-7B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-13-05.321347.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-13-05.321347.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-13-05.321347.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-13-05.321347.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-13-05.321347.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-13-05.321347.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["**/details_harness|winogrande|5_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-07T22-13-05.321347.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T22_13_05.321347", "path": ["results_2024-01-07T22-13-05.321347.parquet"]}, {"split": "latest", "path": 
["results_2024-01-07T22-13-05.321347.parquet"]}]}]} | 2024-01-07T22:15:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mlabonne/NeuralMarcoro14-7B
Dataset automatically created during the evaluation run of model mlabonne/NeuralMarcoro14-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-07T22:13:05.321347 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mlabonne/NeuralMarcoro14-7B\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralMarcoro14-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T22:13:05.321347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mlabonne/NeuralMarcoro14-7B\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralMarcoro14-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T22:13:05.321347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mlabonne/NeuralMarcoro14-7B\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralMarcoro14-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T22:13:05.321347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
7bc1209d4943e844efd35b2502598718d0e661d7 |
# Dataset Card for Evaluation run of alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo](https://huggingface.co/alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo",
"harness_winogrande_5",
split="train")
```
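For orientation, the sketch below shows one way to enumerate the available configurations and peek at a loaded row. It assumes the standard `datasets` API (`get_dataset_config_names` is part of the library); the exact row fields vary from task to task.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo"

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load one task and inspect a single row (field names vary per task).
data = load_dataset(repo, "harness_winogrande_5", split="latest")
print(data[0])
```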
## Latest results
These are the [latest results from run 2024-01-10T12:31:49.515266](https://huggingface.co/datasets/open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo/blob/main/results_2024-01-10T12-31-49.515266.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26917444066050017,
"acc_stderr": 0.03119520113126439,
"acc_norm": 0.2707410918610733,
"acc_norm_stderr": 0.032026678164616795,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123897,
"mc2": 0.36126081259496323,
"mc2_stderr": 0.013694123437880635
},
"harness|arc:challenge|25": {
"acc": 0.31313993174061433,
"acc_stderr": 0.013552671543623501,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.4607647878908584,
"acc_stderr": 0.004974395131539585,
"acc_norm": 0.6187014538936467,
"acc_norm_stderr": 0.004847129907908675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.032477811859955935,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.032477811859955935
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749916,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749916
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993179,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993179
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.029379170464124818,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.029379170464124818
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776578,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776578
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.02499305339776482,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.02499305339776482
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233483,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233483
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31092436974789917,
"acc_stderr": 0.030066761582977938,
"acc_norm": 0.31092436974789917,
"acc_norm_stderr": 0.030066761582977938
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.017923087667803053,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.017923087667803053
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501943,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501943
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935437,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935437
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.01605079214803655,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.01605079214803655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.02218347766841285,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.02218347766841285
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2973856209150327,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.2973856209150327,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799204,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799204
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266733,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266733
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279333,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.028418208619406794,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.028418208619406794
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573023,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573023
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.18128654970760233,
"acc_stderr": 0.029547741687640024,
"acc_norm": 0.18128654970760233,
"acc_norm_stderr": 0.029547741687640024
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123897,
"mc2": 0.36126081259496323,
"mc2_stderr": 0.013694123437880635
},
"harness|winogrande|5": {
"acc": 0.6345698500394633,
"acc_stderr": 0.013533965097638776
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
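To work with these aggregated numbers programmatically, one option is to load the "latest" split of the "results" configuration mentioned above. This is a minimal sketch; the assumption that the row layout mirrors the JSON dump is not guaranteed by the card.

```python
from datasets import load_dataset

# The "results" config aggregates all runs; "latest" points at the most
# recent evaluation (2024-01-10T12:31:49.515266 here).
results = load_dataset(
    "open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo",
    "results",
    split="latest",
)
print(results[0])  # row layout assumed to mirror the JSON above
```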
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo | [
"region:us"
] | 2024-01-07T22:17:01+00:00 | {"pretty_name": "Evaluation run of alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo](https://huggingface.co/alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-10T12:31:49.515266](https://huggingface.co/datasets/open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo/blob/main/results_2024-01-10T12-31-49.515266.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26917444066050017,\n \"acc_stderr\": 0.03119520113126439,\n \"acc_norm\": 0.2707410918610733,\n \"acc_norm_stderr\": 0.032026678164616795,\n \"mc1\": 0.2178702570379437,\n \"mc1_stderr\": 0.014450846714123897,\n \"mc2\": 0.36126081259496323,\n \"mc2_stderr\": 0.013694123437880635\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.31313993174061433,\n \"acc_stderr\": 0.013552671543623501,\n \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.013880644570156213\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4607647878908584,\n \"acc_stderr\": 0.004974395131539585,\n \"acc_norm\": 0.6187014538936467,\n \"acc_norm_stderr\": 0.004847129907908675\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n \"acc_stderr\": 0.032477811859955935,\n \"acc_norm\": 0.17037037037037037,\n \"acc_norm_stderr\": 0.032477811859955935\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741702,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741702\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n \"acc_stderr\": 0.034370793441061344,\n 
\"acc_norm\": 0.2152777777777778,\n \"acc_norm_stderr\": 0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749916,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749916\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993179,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993179\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.029379170464124818,\n \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.029379170464124818\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776578,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776578\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.02499305339776482,\n \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.02499305339776482\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233483,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233483\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.035014387062967806,\n \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.035014387062967806\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.31092436974789917,\n \"acc_stderr\": 0.030066761582977938,\n \"acc_norm\": 0.31092436974789917,\n \"acc_norm_stderr\": 0.030066761582977938\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22568807339449543,\n \"acc_stderr\": 0.017923087667803053,\n \"acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.017923087667803053\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501943,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501943\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753095,\n \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753095\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.3632286995515695,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137276,\n \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137276\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.21487603305785125,\n \"acc_stderr\": 0.03749492448709698,\n \"acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.03749492448709698\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n \"acc_stderr\": 0.027778835904935437,\n \"acc_norm\": 0.23504273504273504,\n \"acc_norm_stderr\": 0.027778835904935437\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n 
\"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n \"acc_stderr\": 0.01605079214803655,\n \"acc_norm\": 0.2796934865900383,\n \"acc_norm_stderr\": 0.01605079214803655\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.02218347766841285,\n \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.02218347766841285\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2973856209150327,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.2973856209150327,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799204,\n \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799204\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266733,\n \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266733\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n \"acc_stderr\": 0.011015752255279333,\n \"acc_norm\": 0.2470664928292047,\n \"acc_norm_stderr\": 0.011015752255279333\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406794,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406794\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.33636363636363636,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n \"acc_stderr\": 0.030965903123573023,\n \"acc_norm\": 0.25870646766169153,\n \"acc_norm_stderr\": 0.030965903123573023\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.18128654970760233,\n \"acc_stderr\": 0.029547741687640024,\n \"acc_norm\": 0.18128654970760233,\n \"acc_norm_stderr\": 0.029547741687640024\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n \"mc1_stderr\": 0.014450846714123897,\n \"mc2\": 0.36126081259496323,\n \"mc2_stderr\": 0.013694123437880635\n },\n 
\"harness|winogrande|5\": {\n \"acc\": 0.6345698500394633,\n \"acc_stderr\": 0.013533965097638776\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|arc:challenge|25_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|gsm8k|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hellaswag|10_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-15-13.499514.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-15-13.499514.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T12-31-49.515266.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-10T12-31-49.515266.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T12-31-49.515266.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-10T12-31-49.515266.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": 
"2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-10T12-31-49.515266.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["**/details_harness|winogrande|5_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["**/details_harness|winogrande|5_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-10T12-31-49.515266.parquet"]}]}, 
{"config_name": "results", "data_files": [{"split": "2024_01_07T22_15_13.499514", "path": ["results_2024-01-07T22-15-13.499514.parquet"]}, {"split": "2024_01_10T12_31_49.515266", "path": ["results_2024-01-10T12-31-49.515266.parquet"]}, {"split": "latest", "path": ["results_2024-01-10T12-31-49.515266.parquet"]}]}]} | 2024-01-10T12:33:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo
Dataset automatically created during the evaluation run of model alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
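A minimal sketch is shown below; the repository name is assumed from the leaderboard's `details_<org>__<model>` naming pattern rather than stated explicitly here, and any of the 63 configurations can be substituted for the one shown:

```python
from datasets import load_dataset

# Assumed repo name, following the leaderboard's "details_<org>__<model>" pattern
data = load_dataset(
    "open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo",
    "harness_winogrande_5",
    split="train",
)
```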
## Latest results
These are the latest results from run 2024-01-10T12:31:49.515266 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
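The aggregated numbers themselves live in the "results" configuration; a short sketch for reading them, assuming the same repository name as above (the "latest" split is listed in this card's configuration metadata):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points at the newest run
results = load_dataset(
    "open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo",
    "results",
    split="latest",
)
print(results[0])  # aggregated scores for the latest evaluation run
```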
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo\n\n\n\nDataset automatically created during the evaluation run of model alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-10T12:31:49.515266(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo\n\n\n\nDataset automatically created during the evaluation run of model alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-10T12:31:49.515266(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
213,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo\n\n\n\nDataset automatically created during the evaluation run of model alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-10T12:31:49.515266(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:"
] |
7e1775c4b24942fe685dc49f6e541c36ccca23d4 |
# Dataset Card for Evaluation run of ryandt/MusingCaterpillar
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ryandt/MusingCaterpillar](https://huggingface.co/ryandt/MusingCaterpillar) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ryandt__MusingCaterpillar",
"harness_winogrande_5",
split="train")
```
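As a quick usage sketch continuing from the snippet above (the per-example column names vary by task, so this just inspects what is there):

```python
# Continuing from the load_dataset call above
df = data.to_pandas()
print(df.shape)    # number of evaluated examples and per-example fields
print(df.columns)  # inspect the available fields; names vary by task
```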
## Latest results
These are the [latest results from run 2024-01-07T22:15:17.631393](https://huggingface.co/datasets/open-llm-leaderboard/details_ryandt__MusingCaterpillar/blob/main/results_2024-01-07T22-15-17.631393.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6565329262711282,
"acc_stderr": 0.03189924500320495,
"acc_norm": 0.6577462207289702,
"acc_norm_stderr": 0.03253908424103263,
"mc1": 0.5703794369645043,
"mc1_stderr": 0.01732923458040909,
"mc2": 0.7092826239626928,
"mc2_stderr": 0.015026732524325976
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725225,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7179844652459669,
"acc_stderr": 0.004490612245335218,
"acc_norm": 0.8833897629954193,
"acc_norm_stderr": 0.003202993346991063
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138215,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.01346820161406629,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.01346820161406629
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46145251396648046,
"acc_stderr": 0.016672731267552258,
"acc_norm": 0.46145251396648046,
"acc_norm_stderr": 0.016672731267552258
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730583,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5703794369645043,
"mc1_stderr": 0.01732923458040909,
"mc2": 0.7092826239626928,
"mc2_stderr": 0.015026732524325976
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.011099796645920531
},
"harness|gsm8k|5": {
"acc": 0.6224412433661866,
"acc_stderr": 0.01335315066635854
}
}
```
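
If you prefer working with the raw JSON file linked above rather than the `datasets` library, a minimal sketch for summarizing it could look like the following; the aggregation over the `hendrycksTest` (MMLU) subtasks is illustrative, and note that the file on the Hub may nest this dict under a top-level "results" key:

```python
import json

# Load the results file linked above (downloaded locally first); adjust the
# path/nesting if the Hub file wraps these metrics in a "results" key.
with open("results_2024-01-07T22-15-17.631393.json") as f:
    metrics = json.load(f)

# Mean accuracy over the MMLU ("hendrycksTest") subtasks shown above.
mmlu = [v["acc"] for k, v in metrics.items()
        if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```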
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ryandt__MusingCaterpillar | [
"region:us"
] | 2024-01-07T22:17:30+00:00 | {"pretty_name": "Evaluation run of ryandt/MusingCaterpillar", "dataset_summary": "Dataset automatically created during the evaluation run of model [ryandt/MusingCaterpillar](https://huggingface.co/ryandt/MusingCaterpillar) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ryandt__MusingCaterpillar\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T22:15:17.631393](https://huggingface.co/datasets/open-llm-leaderboard/details_ryandt__MusingCaterpillar/blob/main/results_2024-01-07T22-15-17.631393.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6565329262711282,\n \"acc_stderr\": 0.03189924500320495,\n \"acc_norm\": 0.6577462207289702,\n \"acc_norm_stderr\": 0.03253908424103263,\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.7092826239626928,\n \"mc2_stderr\": 0.015026732524325976\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725225,\n \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7179844652459669,\n \"acc_stderr\": 0.004490612245335218,\n \"acc_norm\": 0.8833897629954193,\n \"acc_norm_stderr\": 0.003202993346991063\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.01346820161406629,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.01346820161406629\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46145251396648046,\n \"acc_stderr\": 0.016672731267552258,\n \"acc_norm\": 0.46145251396648046,\n \"acc_norm_stderr\": 0.016672731267552258\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730583,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730583\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.7092826239626928,\n \"mc2_stderr\": 0.015026732524325976\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920531\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6224412433661866,\n \"acc_stderr\": 
0.01335315066635854\n }\n}\n```", "repo_url": "https://huggingface.co/ryandt/MusingCaterpillar", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-15-17.631393.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-15-17.631393.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-15-17.631393.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-15-17.631393.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-15-17.631393.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T22_15_17.631393", "path": ["**/details_harness|winogrande|5_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-07T22-15-17.631393.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_07T22_15_17.631393", "path": ["results_2024-01-07T22-15-17.631393.parquet"]}, {"split": "latest", "path": ["results_2024-01-07T22-15-17.631393.parquet"]}]}]} | 2024-01-07T22:17:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ryandt/MusingCaterpillar
Dataset automatically created during the evaluation run of model ryandt/MusingCaterpillar on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-07T22:15:17.631393 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ryandt/MusingCaterpillar\n\n\n\nDataset automatically created during the evaluation run of model ryandt/MusingCaterpillar on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T22:15:17.631393(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ryandt/MusingCaterpillar\n\n\n\nDataset automatically created during the evaluation run of model ryandt/MusingCaterpillar on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T22:15:17.631393(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ryandt/MusingCaterpillar\n\n\n\nDataset automatically created during the evaluation run of model ryandt/MusingCaterpillar on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T22:15:17.631393(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
8984d384ba49177e23473d2a7083c39a0b248b14 |
# Dataset Card for Evaluation run of NeverSleep/Noromaid-13b-v0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeverSleep/Noromaid-13b-v0.3](https://huggingface.co/NeverSleep/Noromaid-13b-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
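To see which configurations are available before loading anything, the configuration names can be enumerated with the standard `datasets` helper. This is a minimal sketch; the repository name is this dataset's own, and the printed slice is just for illustration:

```python
from datasets import get_dataset_config_names

# Enumerate the per-task configurations (plus the aggregated "results" config)
configs = get_dataset_config_names("open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3")
print(len(configs), configs[:5])
```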
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details of the 5-shot Winogrande evaluation;
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3",
	"harness_winogrande_5",
	split="train")
```
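Replacing `"harness_winogrande_5"` with any other configuration name (for example `"harness_gsm8k_5"`) loads that task's details, and a timestamped split name such as `"2024_01_08T08_43_54.536488"` selects a specific run instead of the latest one.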
## Latest results
These are the [latest results from run 2024-01-08T08:43:54.536488](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3/blob/main/results_2024-01-08T08-43-54.536488.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5677987077394565,
"acc_stderr": 0.033653954046911065,
"acc_norm": 0.5743169734927792,
"acc_norm_stderr": 0.034368230343916395,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5073138068542993,
"mc2_stderr": 0.015726117257006858
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.01433223630679015,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6479784903405696,
"acc_stderr": 0.004766245539606633,
"acc_norm": 0.8441545508862777,
"acc_norm_stderr": 0.0036196748640350256
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374768,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374768
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920935,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920935
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472434,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472434
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316455,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316455
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.02525448542479961,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.02525448542479961
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028604,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028604
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236153,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236153
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137504,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243739,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243739
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703643,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703643
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.015302380123542106,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.015302380123542106
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895817,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4592178770949721,
"acc_stderr": 0.016666783616525772,
"acc_norm": 0.4592178770949721,
"acc_norm_stderr": 0.016666783616525772
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.027245613047215355,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.027245613047215355
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630988,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630988
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983965,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.020017629214213094,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.020017629214213094
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014638,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014638
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5073138068542993,
"mc2_stderr": 0.015726117257006858
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
},
"harness|gsm8k|5": {
"acc": 0.2304776345716452,
"acc_stderr": 0.011600249020595825
}
}
```
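The aggregated metrics shown above are also stored in the "results" configuration of this dataset. As a minimal sketch for retrieving them programmatically (the `"latest"` split alias below follows this dataset's file layout, where it points to the newest results file):

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics for each evaluation run;
# the "latest" split alias resolves to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3",
    "results",
    split="latest",
)
print(results[0])  # aggregated per-task accuracies, as in the JSON above
```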
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3 | [
"region:us"
] | 2024-01-07T22:18:20+00:00 | {"pretty_name": "Evaluation run of NeverSleep/Noromaid-13b-v0.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeverSleep/Noromaid-13b-v0.3](https://huggingface.co/NeverSleep/Noromaid-13b-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T08:43:54.536488](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3/blob/main/results_2024-01-08T08-43-54.536488.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5677987077394565,\n \"acc_stderr\": 0.033653954046911065,\n \"acc_norm\": 0.5743169734927792,\n \"acc_norm_stderr\": 0.034368230343916395,\n \"mc1\": 0.35495716034271724,\n \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5073138068542993,\n \"mc2_stderr\": 0.015726117257006858\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.01433223630679015,\n \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6479784903405696,\n \"acc_stderr\": 0.004766245539606633,\n \"acc_norm\": 0.8441545508862777,\n \"acc_norm_stderr\": 0.0036196748640350256\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920935,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920935\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472434,\n \"acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472434\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316455,\n \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316455\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.02525448542479961,\n \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.02525448542479961\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236153,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236153\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137504,\n \"acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137504\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243739,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243739\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703643,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703643\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7586206896551724,\n \"acc_stderr\": 0.015302380123542106,\n \"acc_norm\": 0.7586206896551724,\n \"acc_norm_stderr\": 0.015302380123542106\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895817,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895817\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4592178770949721,\n \"acc_stderr\": 0.016666783616525772,\n \"acc_norm\": 0.4592178770949721,\n \"acc_norm_stderr\": 0.016666783616525772\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.027245613047215355,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.027245613047215355\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.027417996705630988,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.027417996705630988\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766,\n \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734575,\n \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734575\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.020017629214213094,\n \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.020017629214213094\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.030360490154014638,\n \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.030360490154014638\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5073138068542993,\n \"mc2_stderr\": 0.015726117257006858\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2304776345716452,\n \"acc_stderr\": 0.011600249020595825\n 
}\n}\n```", "repo_url": "https://huggingface.co/NeverSleep/Noromaid-13b-v0.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|arc:challenge|25_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|gsm8k|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hellaswag|10_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-16-01.123734.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-16-01.123734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T08-43-54.536488.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T08-43-54.536488.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T08-43-54.536488.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T08-43-54.536488.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-16-01.123734.parquet"]}, 
{"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["**/details_harness|winogrande|5_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": ["**/details_harness|winogrande|5_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T08-43-54.536488.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T22_16_01.123734", "path": ["results_2024-01-07T22-16-01.123734.parquet"]}, {"split": "2024_01_08T08_43_54.536488", "path": 
["results_2024-01-08T08-43-54.536488.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T08-43-54.536488.parquet"]}]}]} | 2024-01-08T08:46:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NeverSleep/Noromaid-13b-v0.3
Dataset automatically created during the evaluation run of model NeverSleep/Noromaid-13b-v0.3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-08T08:43:54.536488 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NeverSleep/Noromaid-13b-v0.3\n\n\n\nDataset automatically created during the evaluation run of model NeverSleep/Noromaid-13b-v0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T08:43:54.536488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NeverSleep/Noromaid-13b-v0.3\n\n\n\nDataset automatically created during the evaluation run of model NeverSleep/Noromaid-13b-v0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T08:43:54.536488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NeverSleep/Noromaid-13b-v0.3\n\n\n\nDataset automatically created during the evaluation run of model NeverSleep/Noromaid-13b-v0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T08:43:54.536488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
3f8fba8d9e9a11ebf328a8b3e693ec141d39db3a |
# Dataset Card for Evaluation run of sreeramajay/TinyLlama-1.1B-orca-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sreeramajay/TinyLlama-1.1B-orca-v1.0](https://huggingface.co/sreeramajay/TinyLlama-1.1B-orca-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
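To see every available configuration programmatically, you can enumerate them with the `datasets` helper below (a minimal sketch; it assumes network access to the Hub):

```python
from datasets import get_dataset_config_names

# Lists the 63 per-task configs plus the aggregated "results" config
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0"
)
print(len(configs), configs[:5])
```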
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande task; the "train" split
# always points at the latest evaluation run
data = load_dataset("open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0",
	"harness_winogrande_5",
	split="train")
```
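If you only need the aggregated metrics rather than per-sample details, the same call can target the "results" configuration described above. This is a sketch, not an official recipe: the `"latest"` split name is taken from this card's metadata, and the timestamped splits can be used to pin a specific run:

```python
from datasets import load_dataset

# Aggregated metrics; "latest" tracks the most recent evaluation run,
# while timestamped splits (e.g. "2024_01_08T02_53_34.794924") pin one run
results = load_dataset("open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0",
	"results",
	split="latest")
```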
## Latest results
These are the [latest results from run 2024-01-08T02:53:34.794924](https://huggingface.co/datasets/open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0/blob/main/results_2024-01-08T02-53-34.794924.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25886592240119943,
"acc_stderr": 0.030852834645833542,
"acc_norm": 0.2598219974756462,
"acc_norm_stderr": 0.03159832550179831,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3657842399449893,
"mc2_stderr": 0.013644177619439266
},
"harness|arc:challenge|25": {
"acc": 0.3438566552901024,
"acc_stderr": 0.013880644570156211,
"acc_norm": 0.363481228668942,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.45648277235610435,
"acc_stderr": 0.004970846697552306,
"acc_norm": 0.6123282214698267,
"acc_norm_stderr": 0.004862232790041579
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.033556772163131424,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.033556772163131424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123387,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.02995785132986934,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.02995785132986934
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707841,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707841
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302054,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302054
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208834,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208834
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268047,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268047
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845447,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.02221110681006167,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.02221110681006167
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.02738140692786896,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.02738140692786896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283164,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283164
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431166,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431166
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.280970625798212,
"acc_stderr": 0.016073127851221246,
"acc_norm": 0.280970625798212,
"acc_norm_stderr": 0.016073127851221246
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859926,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859926
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982477,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818777,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818777
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.20921985815602837,
"acc_stderr": 0.024264769439988496,
"acc_norm": 0.20921985815602837,
"acc_norm_stderr": 0.024264769439988496
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.01089612365267665,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.01089612365267665
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.026040662474201264,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.026040662474201264
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.01802047414839358,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.01802047414839358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.15918367346938775,
"acc_stderr": 0.02342097206916636,
"acc_norm": 0.15918367346938775,
"acc_norm_stderr": 0.02342097206916636
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.03610805018031024,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.03610805018031024
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824565,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824565
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3657842399449893,
"mc2_stderr": 0.013644177619439266
},
"harness|winogrande|5": {
"acc": 0.6140489344909235,
"acc_stderr": 0.013682036993397402
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.004106620637749701
}
}
```
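As a quick illustration of how these fields nest, here is a small, self-contained sketch that averages `acc_norm` over the MMLU (`hendrycksTest`) tasks; the dict below is a hand-copied subset of the results printed above, kept tiny so the example runs on its own:

```python
# Subset of the results dict shown above (values copied from this card)
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.24},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.18518518518518517},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.17763157894736842},
    "harness|winogrande|5": {"acc": 0.6140489344909235},  # not an MMLU task
}

mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"mean acc_norm over {len(mmlu)} MMLU tasks: {sum(mmlu) / len(mmlu):.4f}")
```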
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0 | [
"region:us"
] | 2024-01-07T22:20:39+00:00 | {"pretty_name": "Evaluation run of sreeramajay/TinyLlama-1.1B-orca-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [sreeramajay/TinyLlama-1.1B-orca-v1.0](https://huggingface.co/sreeramajay/TinyLlama-1.1B-orca-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T02:53:34.794924](https://huggingface.co/datasets/open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0/blob/main/results_2024-01-08T02-53-34.794924.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25886592240119943,\n \"acc_stderr\": 0.030852834645833542,\n \"acc_norm\": 0.2598219974756462,\n \"acc_norm_stderr\": 0.03159832550179831,\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3657842399449893,\n \"mc2_stderr\": 0.013644177619439266\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3438566552901024,\n \"acc_stderr\": 0.013880644570156211,\n \"acc_norm\": 0.363481228668942,\n \"acc_norm_stderr\": 0.014056207319068283\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45648277235610435,\n \"acc_stderr\": 0.004970846697552306,\n \"acc_norm\": 0.6123282214698267,\n \"acc_norm_stderr\": 0.004862232790041579\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.033556772163131424,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.033556772163131424\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123387,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n \"acc_norm_stderr\": 0.03514697467862388\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n \"acc_stderr\": 0.02995785132986934,\n \"acc_norm\": 0.1907514450867052,\n \"acc_norm_stderr\": 0.02995785132986934\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707841,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707841\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068642,\n \"acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068642\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.03512207412302054,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.03512207412302054\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n \"acc_stderr\": 0.024251071262208834,\n \"acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.024251071262208834\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268047,\n \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268047\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.02912652283458682,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.02912652283458682\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845447,\n \"acc_norm\": 0.24352331606217617,\n 
\"acc_norm_stderr\": 0.030975436386845447\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.02221110681006167,\n \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.02221110681006167\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.02738140692786896,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.02738140692786896\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.3542600896860987,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283164,\n \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283164\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n \"acc_stderr\": 0.029202540153431166,\n \"acc_norm\": 0.27350427350427353,\n \"acc_norm_stderr\": 0.029202540153431166\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.280970625798212,\n \"acc_stderr\": 0.016073127851221246,\n \"acc_norm\": 0.280970625798212,\n \"acc_norm_stderr\": 0.016073127851221246\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859926,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859926\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982477,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982477\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.2958199356913183,\n \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.20921985815602837,\n \"acc_stderr\": 0.024264769439988496,\n \"acc_norm\": 0.20921985815602837,\n \"acc_norm_stderr\": 0.024264769439988496\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.01089612365267665,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.01089612365267665\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.026040662474201264,\n \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.026040662474201264\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.272875816993464,\n \"acc_stderr\": 0.01802047414839358,\n \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.01802047414839358\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.15918367346938775,\n \"acc_stderr\": 0.02342097206916636,\n \"acc_norm\": 0.15918367346938775,\n \"acc_norm_stderr\": 0.02342097206916636\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n \"acc_stderr\": 0.03610805018031024,\n \"acc_norm\": 0.3132530120481928,\n \"acc_norm_stderr\": 0.03610805018031024\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824565,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824565\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3657842399449893,\n \"mc2_stderr\": 0.013644177619439266\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6140489344909235,\n \"acc_stderr\": 0.013682036993397402\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.022744503411675512,\n \"acc_stderr\": 0.004106620637749701\n }\n}\n```", "repo_url": "https://huggingface.co/sreeramajay/TinyLlama-1.1B-orca-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|arc:challenge|25_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|gsm8k|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hellaswag|10_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-18-48.841310.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-18-48.841310.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-53-34.794924.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-53-34.794924.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-53-34.794924.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T02-53-34.794924.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-53-34.794924.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": 
"2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-18-48.841310.parquet"]}, 
{"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["**/details_harness|winogrande|5_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": ["**/details_harness|winogrande|5_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T02-53-34.794924.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T22_18_48.841310", "path": ["results_2024-01-07T22-18-48.841310.parquet"]}, {"split": "2024_01_08T02_53_34.794924", "path": 
["results_2024-01-08T02-53-34.794924.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T02-53-34.794924.parquet"]}]}]} | 2024-01-08T02:55:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sreeramajay/TinyLlama-1.1B-orca-v1.0
Dataset automatically created during the evaluation run of model sreeramajay/TinyLlama-1.1B-orca-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
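For instance (a minimal sketch: the repository id below is assumed to follow the leaderboard's usual `details_<org>__<model>` naming for this model, and `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's standard naming scheme;
# adjust it if this details repo lives under a different name.
data = load_dataset(
    "open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0",
    "harness_winogrande_5",  # any config name from the metadata above works
    split="latest",          # or a timestamped split such as "2024_01_08T02_53_34.794924"
)
```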
## Latest results
These are the latest results from run 2024-01-08T02:53:34.794924 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
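To inspect these aggregated numbers programmatically, the "results" configuration can be loaded the same way (same assumed repo id as above; the "latest" split tracks the most recent run):

```python
from datasets import load_dataset

# "results" aggregates every metric of a run into a single row per run.
results = load_dataset(
    "open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0",
    "results",
    split="latest",
)
print(results[0])  # aggregated scores for the 2024-01-08T02:53:34 run
```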
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
] |
e3202d9311f4ad1a935b28ae23ff196a4c3d84e7 |
# Dataset Card for Evaluation run of aloobun/falcon-1b-cot-t2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aloobun/falcon-1b-cot-t2](https://huggingface.co/aloobun/falcon-1b-cot-t2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
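If you just want to enumerate what is available before loading anything, the `datasets` library can list the configurations directly (a minimal sketch; config names follow the `harness_<task>_<n_shots>` pattern used below, plus the aggregated `results` config):

```python
from datasets import get_dataset_config_names

# Enumerate the evaluation configurations exposed by this details repo.
configs = get_dataset_config_names("open-llm-leaderboard/details_aloobun__falcon-1b-cot-t2")
print(len(configs), configs[:5])
```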
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aloobun__falcon-1b-cot-t2",
"harness_winogrande_5",
split="train")
```
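Since every run is also kept under its own timestamped split, a specific run can be pinned instead of following the moving `train`/`latest` splits (a sketch; the split name below corresponds to the 2024-01-08 run of this repo and will differ for other runs):

```python
from datasets import load_dataset

# Pin the 2024-01-08 evaluation run rather than whatever "train" currently points to.
run = load_dataset("open-llm-leaderboard/details_aloobun__falcon-1b-cot-t2",
	"harness_winogrande_5",
	split="2024_01_08T06_26_29.357404")
print(run[0].keys())  # per-example fields recorded by the evaluation harness
```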
## Latest results
These are the [latest results from run 2024-01-08T06:26:29.357404](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__falcon-1b-cot-t2/blob/main/results_2024-01-08T06-26-29.357404.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23189624407991713,
"acc_stderr": 0.029929135523085983,
"acc_norm": 0.23172050414924006,
"acc_norm_stderr": 0.03071719552245984,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931593,
"mc2": 0.48380742228126167,
"mc2_stderr": 0.016615885051114176
},
"harness|arc:challenge|25": {
"acc": 0.20819112627986347,
"acc_stderr": 0.011864866118448069,
"acc_norm": 0.24744027303754265,
"acc_norm_stderr": 0.01261035266329267
},
"harness|hellaswag|10": {
"acc": 0.25731925911173076,
"acc_stderr": 0.004362633637374482,
"acc_norm": 0.24746066520613424,
"acc_norm_stderr": 0.004306547156331409
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931593,
"mc2": 0.48380742228126167,
"mc2_stderr": 0.016615885051114176
},
"harness|winogrande|5": {
"acc": 0.5035516969218626,
"acc_stderr": 0.014052131146915852
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
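As a quick sanity check, the aggregate MMLU number reported on the leaderboard is simply the mean over the 57 `harness|hendrycksTest-*` entries above (a minimal sketch operating on the JSON shown, assumed here to be loaded into a Python dict named `results`):

```python
# `results` is assumed to hold the "Latest results" dict printed above.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```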
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_aloobun__falcon-1b-cot-t2 | [
"region:us"
] | 2024-01-07T22:22:12+00:00 | {"pretty_name": "Evaluation run of aloobun/falcon-1b-cot-t2", "dataset_summary": "Dataset automatically created during the evaluation run of model [aloobun/falcon-1b-cot-t2](https://huggingface.co/aloobun/falcon-1b-cot-t2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aloobun__falcon-1b-cot-t2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T06:26:29.357404](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__falcon-1b-cot-t2/blob/main/results_2024-01-08T06-26-29.357404.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23189624407991713,\n \"acc_stderr\": 0.029929135523085983,\n \"acc_norm\": 0.23172050414924006,\n \"acc_norm_stderr\": 0.03071719552245984,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931593,\n \"mc2\": 0.48380742228126167,\n \"mc2_stderr\": 0.016615885051114176\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20819112627986347,\n \"acc_stderr\": 0.011864866118448069,\n \"acc_norm\": 0.24744027303754265,\n \"acc_norm_stderr\": 0.01261035266329267\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25731925911173076,\n \"acc_stderr\": 0.004362633637374482,\n \"acc_norm\": 0.24746066520613424,\n \"acc_norm_stderr\": 0.004306547156331409\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 
0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931593,\n \"mc2\": 0.48380742228126167,\n \"mc2_stderr\": 0.016615885051114176\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5035516969218626,\n \"acc_stderr\": 0.014052131146915852\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/aloobun/falcon-1b-cot-t2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|arc:challenge|25_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|gsm8k|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hellaswag|10_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-20-28.857048.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-20-28.857048.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T06-26-29.357404.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T06-26-29.357404.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T06-26-29.357404.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T06-26-29.357404.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-20-28.857048.parquet"]}, 
{"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["**/details_harness|winogrande|5_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": ["**/details_harness|winogrande|5_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T06-26-29.357404.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T22_20_28.857048", "path": ["results_2024-01-07T22-20-28.857048.parquet"]}, {"split": "2024_01_08T06_26_29.357404", "path": 
["results_2024-01-08T06-26-29.357404.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T06-26-29.357404.parquet"]}]}]} | 2024-01-08T06:28:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of aloobun/falcon-1b-cot-t2
Dataset automatically created during the evaluation run of model aloobun/falcon-1b-cot-t2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
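A minimal sketch of that call; note the repo id below is an assumption, inferred from the leaderboard's usual `details_<org>__<model>` naming pattern:
```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's details_<org>__<model> convention;
# "train" always points at the latest results for the chosen task config.
data = load_dataset("open-llm-leaderboard/details_aloobun__falcon-1b-cot-t2",
	"harness_winogrande_5",
	split="train")
```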
## Latest results
These are the latest results from run 2024-01-08T06:26:29.357404 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
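As a sketch, the raw results file for that run could also be fetched directly; the filename here is an assumption built from the run timestamp above and the `results_<timestamp>.json` layout these repos typically use:
```python
from huggingface_hub import hf_hub_download

# Filename assumed from the run timestamp; adjust if the repo layout differs.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_aloobun__falcon-1b-cot-t2",
    filename="results_2024-01-08T06-26-29.357404.json",
    repo_type="dataset",
)
```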
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of aloobun/falcon-1b-cot-t2\n\n\n\nDataset automatically created during the evaluation run of model aloobun/falcon-1b-cot-t2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T06:26:29.357404(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of aloobun/falcon-1b-cot-t2\n\n\n\nDataset automatically created during the evaluation run of model aloobun/falcon-1b-cot-t2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T06:26:29.357404(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aloobun/falcon-1b-cot-t2\n\n\n\nDataset automatically created during the evaluation run of model aloobun/falcon-1b-cot-t2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T06:26:29.357404(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
2459309a0b3d2326ab6a31358cb94d24e779581e |
*note: The dataset was fine, but the parquet bot appears to have messed it up somehow; if it's not visible up there, look at dataset.jsonl*
Updates:
- Jan 7th 2024 - scraped all the Worldbuilding Stack Exchange questions with 5+ rep, leaving 18,000 questions (see the sketch after this list).
- Jan 8th 2024 - incorporated 100 MB more of roleplay and worldbuilding data; the dataset now includes Pippa, Bluemoon, and RolePlayIO.
- Jan 9th 2024 - more misc. worldbuilding data incorporated; the dataset is now complete enough.
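A hypothetical sketch of the scrape described in the Jan 7th update, not the author's actual pipeline; the endpoint and parameters below are the public Stack Exchange API's:
```python
import requests

# Pull Worldbuilding Stack Exchange questions scoring 5+ via the public API.
API = "https://api.stackexchange.com/2.3/questions"
params = {
    "site": "worldbuilding",
    "sort": "votes",   # sort by score so "min" filters on score
    "min": 5,          # the 5+ threshold from the update note
    "order": "desc",
    "pagesize": 100,
    "page": 1,
}

questions = []
while True:
    resp = requests.get(API, params=params).json()
    questions.extend(resp.get("items", []))
    if not resp.get("has_more"):
        break
    params["page"] += 1  # NB: the real API throttles; honor resp.get("backoff")

print(f"collected {len(questions)} questions")
```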
# Worldbuild
A dataset focused on worldbuilding and roleplay: mostly well-formatted, high-quality data in markdown format. | VatsaDev/worldbuild | [
"task_categories:conversational",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"region:us"
] | 2024-01-07T22:24:40+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation"], "pretty_name": "worldbuilding"} | 2024-01-09T19:25:34+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #region-us
|
*note: The dataset was fine, but the parquet bot appears to have messed it up somehow; if it's not visible up there, look at URL*
Updates:
- Jan 7th 2024 - scraped all the Worldbuilding Stack Exchange questions with 5+ rep, leaving 18,000 questions.
- Jan 8th 2024 - incorporated 100 MB more of roleplay and worldbuilding data; the dataset now includes Pippa, Bluemoon, and RolePlayIO.
- Jan 9th 2024 - more misc. worldbuilding data incorporated; the dataset is now complete enough.
# Worldbuild
A dataset focused on worldbuilding and roleplay: mostly well-formatted, high-quality data in markdown format. | [
"# Worldbuild\n\nA dataset focused on worldbuilding and roleplay, mostly well-formatted, high quality data, in the markdown format."
] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #region-us \n",
"# Worldbuild\n\nA dataset focused on worldbuilding and roleplay, mostly well-formatted, high quality data, in the markdown format."
] | [
48,
30
] | [
"passage: TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #region-us \n# Worldbuild\n\nA dataset focused on worldbuilding and roleplay, mostly well-formatted, high quality data, in the markdown format."
] |
370c4f92da386d1e48241e6e0d2c537a9076c220 |
# Dataset Card for Evaluation run of remyxai/localmentor_25K_3epochs_tinyllama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [remyxai/localmentor_25K_3epochs_tinyllama](https://huggingface.co/remyxai/localmentor_25K_3epochs_tinyllama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Details for one task config; the "train" split always points at the
# latest run's results.
data = load_dataset("open-llm-leaderboard/details_remyxai__localmentor_25K_3epochs_tinyllama",
	"harness_winogrande_5",
	split="train")
```
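The aggregated metrics shown under "Latest results" live in the "results" configuration; a minimal sketch for reading them, assuming this repo defines the same "results" config and "latest" split as the other leaderboard detail repos:
```python
from datasets import load_dataset

# Aggregated run-level metrics; "latest" points at the newest results.
results = load_dataset("open-llm-leaderboard/details_remyxai__localmentor_25K_3epochs_tinyllama",
	"results",
	split="latest")
```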
## Latest results
These are the [latest results from run 2024-01-07T22:25:15.681205](https://huggingface.co/datasets/open-llm-leaderboard/details_remyxai__localmentor_25K_3epochs_tinyllama/blob/main/results_2024-01-07T22-25-15.681205.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2554930079258233,
"acc_stderr": 0.030536114632474777,
"acc_norm": 0.2566564564092015,
"acc_norm_stderr": 0.03129471436685104,
"mc1": 0.2141982864137087,
"mc1_stderr": 0.014362148155690469,
"mc2": 0.3606525365860081,
"mc2_stderr": 0.013646263392146925
},
"harness|arc:challenge|25": {
"acc": 0.31399317406143346,
"acc_stderr": 0.013562691224726295,
"acc_norm": 0.34215017064846415,
"acc_norm_stderr": 0.013864152159177275
},
"harness|hellaswag|10": {
"acc": 0.44542919737104164,
"acc_stderr": 0.004959973514772512,
"acc_norm": 0.5901214897430791,
"acc_norm_stderr": 0.004908059353503847
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.03247781185995594,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.03247781185995594
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.1513157894736842,
"acc_stderr": 0.029162631596843975,
"acc_norm": 0.1513157894736842,
"acc_norm_stderr": 0.029162631596843975
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.02737770662467071,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.02737770662467071
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1791907514450867,
"acc_stderr": 0.029242513059063287,
"acc_norm": 0.1791907514450867,
"acc_norm_stderr": 0.029242513059063287
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02850485647051419,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02850485647051419
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1793103448275862,
"acc_stderr": 0.03196766433373186,
"acc_norm": 0.1793103448275862,
"acc_norm_stderr": 0.03196766433373186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.02241804289111395,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.02241804289111395
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.16748768472906403,
"acc_stderr": 0.0262730860475354,
"acc_norm": 0.16748768472906403,
"acc_norm_stderr": 0.0262730860475354
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538787,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538787
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178263,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178263
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2717948717948718,
"acc_stderr": 0.022556551010132368,
"acc_norm": 0.2717948717948718,
"acc_norm_stderr": 0.022556551010132368
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267627,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267627
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868973,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868973
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.033367670865679766,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.033367670865679766
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.017818849564796634,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.017818849564796634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955914,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749472,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23627075351213284,
"acc_stderr": 0.015190473717037484,
"acc_norm": 0.23627075351213284,
"acc_norm_stderr": 0.015190473717037484
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.023176298203992012,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.023176298203992012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574877,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3183279742765273,
"acc_stderr": 0.026457225067811032,
"acc_norm": 0.3183279742765273,
"acc_norm_stderr": 0.026457225067811032
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.024748624490537365,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.024748624490537365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642983,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642983
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279341,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279341
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3492647058823529,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.3492647058823529,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984927,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984927
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1673469387755102,
"acc_stderr": 0.023897144768914524,
"acc_norm": 0.1673469387755102,
"acc_norm_stderr": 0.023897144768914524
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.03591566797824662,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.03591566797824662
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2141982864137087,
"mc1_stderr": 0.014362148155690469,
"mc2": 0.3606525365860081,
"mc2_stderr": 0.013646263392146925
},
"harness|winogrande|5": {
"acc": 0.6045777426992897,
"acc_stderr": 0.013741678387545347
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.002822713322387704
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_remyxai__localmentor_25K_3epochs_tinyllama | [
"region:us"
] | 2024-01-07T22:27:08+00:00 | {"pretty_name": "Evaluation run of remyxai/localmentor_25K_3epochs_tinyllama", "dataset_summary": "Dataset automatically created during the evaluation run of model [remyxai/localmentor_25K_3epochs_tinyllama](https://huggingface.co/remyxai/localmentor_25K_3epochs_tinyllama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_remyxai__localmentor_25K_3epochs_tinyllama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T22:25:15.681205](https://huggingface.co/datasets/open-llm-leaderboard/details_remyxai__localmentor_25K_3epochs_tinyllama/blob/main/results_2024-01-07T22-25-15.681205.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2554930079258233,\n \"acc_stderr\": 0.030536114632474777,\n \"acc_norm\": 0.2566564564092015,\n \"acc_norm_stderr\": 0.03129471436685104,\n \"mc1\": 0.2141982864137087,\n \"mc1_stderr\": 0.014362148155690469,\n \"mc2\": 0.3606525365860081,\n \"mc2_stderr\": 0.013646263392146925\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.31399317406143346,\n \"acc_stderr\": 0.013562691224726295,\n \"acc_norm\": 0.34215017064846415,\n \"acc_norm_stderr\": 0.013864152159177275\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44542919737104164,\n \"acc_stderr\": 0.004959973514772512,\n \"acc_norm\": 0.5901214897430791,\n \"acc_norm_stderr\": 0.004908059353503847\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n \"acc_stderr\": 0.03247781185995594,\n \"acc_norm\": 0.17037037037037037,\n \"acc_norm_stderr\": 0.03247781185995594\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.1513157894736842,\n \"acc_stderr\": 0.029162631596843975,\n \"acc_norm\": 0.1513157894736842,\n \"acc_norm_stderr\": 0.029162631596843975\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.02737770662467071,\n \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.02737770662467071\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 
0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1791907514450867,\n \"acc_stderr\": 0.029242513059063287,\n \"acc_norm\": 0.1791907514450867,\n \"acc_norm_stderr\": 0.029242513059063287\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02850485647051419,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02850485647051419\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.1793103448275862,\n \"acc_stderr\": 0.03196766433373186,\n \"acc_norm\": 0.1793103448275862,\n \"acc_norm_stderr\": 0.03196766433373186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.02241804289111395,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.02241804289111395\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23225806451612904,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.23225806451612904,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.16748768472906403,\n \"acc_stderr\": 0.0262730860475354,\n \"acc_norm\": 0.16748768472906403,\n \"acc_norm_stderr\": 0.0262730860475354\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538787,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538787\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178263,\n \"acc_norm\": 
0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178263\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2717948717948718,\n \"acc_stderr\": 0.022556551010132368,\n \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.022556551010132368\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267627,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267627\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868973,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868973\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2119205298013245,\n \"acc_stderr\": 0.033367670865679766,\n \"acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.033367670865679766\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22201834862385322,\n \"acc_stderr\": 0.017818849564796634,\n \"acc_norm\": 0.22201834862385322,\n \"acc_norm_stderr\": 0.017818849564796634\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955914,\n \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955914\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.028911208802749472,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.028911208802749472\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n 
\"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23627075351213284,\n \"acc_stderr\": 0.015190473717037484,\n \"acc_norm\": 0.23627075351213284,\n \"acc_norm_stderr\": 0.015190473717037484\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.023176298203992012,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.023176298203992012\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574877,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574877\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3183279742765273,\n \"acc_stderr\": 0.026457225067811032,\n \"acc_norm\": 0.3183279742765273,\n \"acc_norm_stderr\": 0.026457225067811032\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.024748624490537365,\n \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.024748624490537365\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642983,\n \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642983\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n \"acc_stderr\": 0.011015752255279341,\n \"acc_norm\": 0.2470664928292047,\n \"acc_norm_stderr\": 0.011015752255279341\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3492647058823529,\n \"acc_stderr\": 0.028959755196824866,\n \"acc_norm\": 0.3492647058823529,\n \"acc_norm_stderr\": 0.028959755196824866\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n \"acc_stderr\": 0.03764425585984927,\n \"acc_norm\": 0.19090909090909092,\n \"acc_norm_stderr\": 0.03764425585984927\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.023897144768914524,\n \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.023897144768914524\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n \"acc_stderr\": 0.03591566797824662,\n \"acc_norm\": 0.3072289156626506,\n \"acc_norm_stderr\": 0.03591566797824662\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2141982864137087,\n \"mc1_stderr\": 0.014362148155690469,\n \"mc2\": 0.3606525365860081,\n \"mc2_stderr\": 0.013646263392146925\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6045777426992897,\n 
\"acc_stderr\": 0.013741678387545347\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \"acc_stderr\": 0.002822713322387704\n }\n}\n```", "repo_url": "https://huggingface.co/remyxai/localmentor_25K_3epochs_tinyllama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-25-15.681205.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-25-15.681205.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-25-15.681205.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-25-15.681205.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-25-15.681205.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["**/details_harness|winogrande|5_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-07T22-25-15.681205.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T22_25_15.681205", "path": ["results_2024-01-07T22-25-15.681205.parquet"]}, {"split": "latest", "path": ["results_2024-01-07T22-25-15.681205.parquet"]}]}]} | 2024-01-07T22:27:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of remyxai/localmentor_25K_3epochs_tinyllama
Dataset automatically created during the evaluation run of model remyxai/localmentor_25K_3epochs_tinyllama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
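A minimal sketch of that call, assuming this repository follows the leaderboard's standard `details_<org>__<model>` naming scheme (the exact repo id is an inference from that scheme, not quoted from the card):

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard naming convention for this model.
data = load_dataset("open-llm-leaderboard/details_remyxai__localmentor_25K_3epochs_tinyllama",
                    "harness_winogrande_5",
                    split="train")
```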
## Latest results
These are the latest results from run 2024-01-07T22:25:15.681205 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its configuration's "latest" split):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of remyxai/localmentor_25K_3epochs_tinyllama\n\n\n\nDataset automatically created during the evaluation run of model remyxai/localmentor_25K_3epochs_tinyllama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T22:25:15.681205(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of remyxai/localmentor_25K_3epochs_tinyllama\n\n\n\nDataset automatically created during the evaluation run of model remyxai/localmentor_25K_3epochs_tinyllama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T22:25:15.681205(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
201,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of remyxai/localmentor_25K_3epochs_tinyllama\n\n\n\nDataset automatically created during the evaluation run of model remyxai/localmentor_25K_3epochs_tinyllama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T22:25:15.681205(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
007aeaa950081852c98cc21ae7c1ce81902eb9e9 |
# Dataset Card for Evaluation run of amu/zen
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [amu/zen](https://huggingface.co/amu/zen) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_amu__zen",
"harness_winogrande_5",
split="train")
```
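Beyond loading a single configuration, you can enumerate everything the repository exposes. The snippet below is a minimal sketch, not part of the generated card: it assumes the standard layout of these leaderboard detail repositories, i.e. one config per task plus an aggregated "results" config whose "latest" split tracks the most recent run.

```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate all per-task configurations stored in this repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_amu__zen")
print(len(configs), "configs, e.g.:", configs[:5])

# Load the aggregated scores; "latest" always points to the newest run.
results = load_dataset("open-llm-leaderboard/details_amu__zen",
                       "results",
                       split="latest")

# Per-sample details are regular datasets and convert cleanly to pandas.
details = load_dataset("open-llm-leaderboard/details_amu__zen",
                       "harness_winogrande_5",
                       split="latest").to_pandas()
print(details.head())
```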
## Latest results
These are the [latest results from run 2024-01-07T22:25:34.311431](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__zen/blob/main/results_2024-01-07T22-25-34.311431.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its configuration's "latest" split):
```python
{
"all": {
"acc": 0.2334557833204235,
"acc_stderr": 0.029914501058823966,
"acc_norm": 0.23300142324124876,
"acc_norm_stderr": 0.030691715983121565,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.015298077509485085,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.23464163822525597,
"acc_stderr": 0.012383873560768671,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453954
},
"harness|hellaswag|10": {
"acc": 0.25323640709022105,
"acc_stderr": 0.004339764434219063,
"acc_norm": 0.2508464449312886,
"acc_norm_stderr": 0.004326143430360098
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118352,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118352
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.032424147574830975,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.032424147574830975
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.14705882352941177,
"acc_stderr": 0.03524068951567449,
"acc_norm": 0.14705882352941177,
"acc_norm_stderr": 0.03524068951567449
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.035058596825972656,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.035058596825972656
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.03664666337225256,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.03664666337225256
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18652849740932642,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.18652849740932642,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2825112107623318,
"acc_stderr": 0.030216831011508752,
"acc_norm": 0.2825112107623318,
"acc_norm_stderr": 0.030216831011508752
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409163,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409163
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677048,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677048
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.25,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.015298077509485085,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4964483030781373,
"acc_stderr": 0.014052131146915864
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
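For quick post-processing outside the `datasets` library, the raw JSON linked above can be parsed directly. The following is an illustrative sketch rather than official tooling: it assumes the per-task scores sit under a top-level "results" key in the downloaded file (falling back to the top level otherwise), and it relies on Python's `json` module accepting the `NaN` literals the harness emits for the `mc2` fields.

```python
import json
from statistics import mean

# Filename taken from the "Latest results" link above; download it first, e.g.
# with huggingface_hub.hf_hub_download(..., repo_type="dataset").
with open("results_2024-01-07T22-25-34.311431.json") as f:
    data = json.load(f)  # NaN literals parse to float("nan") by default

# Hub result files usually nest the scores under "results"; the card above
# shows that nested dictionary directly, so fall back to the top level.
scores = data.get("results", data)

# Mean accuracy over the MMLU (hendrycksTest) subtasks shown above.
mmlu_acc = [v["acc"] for k, v in scores.items()
            if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu_acc)} MMLU subtasks, mean acc = {mean(mmlu_acc):.4f}")
```

For this particular run the average lands near the 25% random-guessing baseline for four-way multiple choice, consistent with the per-task scores above.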
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_amu__zen | [
"region:us"
] | 2024-01-07T22:27:54+00:00 | {"pretty_name": "Evaluation run of amu/zen", "dataset_summary": "Dataset automatically created during the evaluation run of model [amu/zen](https://huggingface.co/amu/zen) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_amu__zen\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T22:25:34.311431](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__zen/blob/main/results_2024-01-07T22-25-34.311431.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2334557833204235,\n \"acc_stderr\": 0.029914501058823966,\n \"acc_norm\": 0.23300142324124876,\n \"acc_norm_stderr\": 0.030691715983121565,\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.015298077509485085,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.23464163822525597,\n \"acc_stderr\": 0.012383873560768671,\n \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453954\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25323640709022105,\n \"acc_stderr\": 0.004339764434219063,\n \"acc_norm\": 0.2508464449312886,\n \"acc_norm_stderr\": 0.004326143430360098\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118352,\n \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118352\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.032424147574830975,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.032424147574830975\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.14705882352941177,\n \"acc_stderr\": 0.03524068951567449,\n \"acc_norm\": 0.14705882352941177,\n \"acc_norm_stderr\": 0.03524068951567449\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.035058596825972656,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.035058596825972656\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.03664666337225256,\n \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.03664666337225256\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.18652849740932642,\n \"acc_stderr\": 0.028112091210117467,\n \"acc_norm\": 0.18652849740932642,\n \"acc_norm_stderr\": 0.028112091210117467\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 
0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2825112107623318,\n \"acc_stderr\": 0.030216831011508752,\n \"acc_norm\": 0.2825112107623318,\n \"acc_norm_stderr\": 0.030216831011508752\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n \"acc_stderr\": 0.014572650383409163,\n \"acc_norm\": 0.2547486033519553,\n \"acc_norm_stderr\": 0.014572650383409163\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.025646863097137904,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.025646863097137904\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n \"acc_stderr\": 0.024071805887677048,\n \"acc_norm\": 0.2347266881028939,\n \"acc_norm_stderr\": 0.024071805887677048\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.015298077509485085,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4964483030781373,\n \"acc_stderr\": 0.014052131146915864\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/amu/zen", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", 
"configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-25-34.311431.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-25-34.311431.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-25-34.311431.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T22-25-34.311431.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-25-34.311431.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["**/details_harness|winogrande|5_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-07T22-25-34.311431.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T22_25_34.311431", "path": ["results_2024-01-07T22-25-34.311431.parquet"]}, {"split": "latest", "path": 
["results_2024-01-07T22-25-34.311431.parquet"]}]}]} | 2024-01-07T22:28:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of amu/zen
Dataset automatically created during the evaluation run of model amu/zen on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
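```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_amu__zen",
    "harness_winogrande_5",
    split="train")
```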
## Latest results
These are the latest results from run 2024-01-07T22:25:34.311431 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of amu/zen\n\n\n\nDataset automatically created during the evaluation run of model amu/zen on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T22:25:34.311431(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of amu/zen\n\n\n\nDataset automatically created during the evaluation run of model amu/zen on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T22:25:34.311431(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
169,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of amu/zen\n\n\n\nDataset automatically created during the evaluation run of model amu/zen on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T22:25:34.311431(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
08f2789541834abdc574c2fd1f4712812b272eb7 | # Dataset Card for Weblate Translations
<!-- Provide a quick summary of the dataset. -->
A dataset containing strings from projects hosted on [Weblate](https://hosted.weblate.org) and their translations into other languages.
Please consider [donating](https://weblate.org/en/donate/) or [contributing](https://weblate.org/en/contribute/) to Weblate if you find this dataset useful.
To prevent rows with literal values like "None" and "N/A" from being interpreted as missing values, pass the `keep_default_na` parameter like this:
```python
from datasets import load_dataset
dataset = load_dataset("ayymen/Weblate-Translations", keep_default_na=False)
```
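Each language pair lives in its own config. As an illustrative sketch (the `"en-it"` config below appears in this card's config list; any other listed pair works the same way), a specific pair can be loaded by name:
```python
from datasets import load_dataset
# Load one language-pair config by name; without a config name,
# the default config ("en-zgh") is loaded instead.
dataset = load_dataset("ayymen/Weblate-Translations", "en-it", keep_default_na=False)
# TSV-backed configs expose a single "train" split by default.
print(dataset["train"][0])
```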
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** Each sentence pair in the dataset has a corresponding license in the "license" column. This license is the one specified in the Weblate component or project containing the sentence (see the filtering sketch below).
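Because licensing is tracked per row, downstream users can filter to the licenses they can accept. A minimal sketch, using hypothetical license identifiers (inspect the column to see the real values):
```python
from datasets import load_dataset
ds = load_dataset("ayymen/Weblate-Translations", "en-it", keep_default_na=False)["train"]
# The license strings in the allow-list below are assumptions, not a
# guaranteed inventory of the column's values; check set(ds["license"]).
allowed = {"MIT", "Apache-2.0", "GPL-3.0-or-later"}
filtered = ds.filter(lambda row: row["license"] in allowed)
print(f"kept {len(filtered)} of {len(ds)} rows")
```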
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
- Machine Translation
- Language Identification (see the sketch after this list)
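For language identification, the config name itself encodes the language pair, so labeled (text, language) examples can be derived from each side. A rough sketch, assuming each config has two text columns plus a "license" column (the actual column names are not guaranteed by this card):
```python
from datasets import load_dataset
pair = "en-it"  # the config name doubles as the language-pair label
ds = load_dataset("ayymen/Weblate-Translations", pair, keep_default_na=False)["train"]
# Assumption: the two non-"license" columns hold the two sides of the pair,
# in the same order as the languages in the config name.
src_lang, tgt_lang = pair.split("-", 1)
src_col, tgt_col = [c for c in ds.column_names if c != "license"]
labeled = [{"text": row[src_col], "lang": src_lang} for row in ds]
labeled += [{"text": row[tgt_col], "lang": tgt_lang} for row in ds]
```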
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
- Sentence pairs with empty/missing elements were dropped.
- Identical pairs were dropped.
- Trailing whitespace was stripped.
- Rows were deduplicated based on all 3 columns, including "license", on a per-config (per-TSV-file) basis. This means a single config might contain two identical sentence pairs with different licenses, and a different config/subset might contain the exact same row (most likely a different variant/dialect of the same language(s)); see the sketch below.
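In code, the cleaning steps above correspond roughly to the following sketch (illustrative only, not the actual script used; the file name and the text-column handling are assumptions):
```python
import pandas as pd
# Read one per-config TSV; keep_default_na=False preserves literal "None"/"N/A".
df = pd.read_csv("en-it.tsv", sep="\t", keep_default_na=False)
# Treat the two non-"license" columns as the sentence pair.
src_col, tgt_col = [c for c in df.columns if c != "license"]
# Drop pairs with empty/missing elements.
df = df[(df[src_col] != "") & (df[tgt_col] != "")]
# Strip trailing whitespace.
df[src_col] = df[src_col].str.rstrip()
df[tgt_col] = df[tgt_col].str.rstrip()
# Drop identical pairs.
df = df[df[src_col] != df[tgt_col]]
# Deduplicate on all 3 columns, including "license".
df = df.drop_duplicates(subset=[src_col, tgt_col, "license"])
```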
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
Weblate users.
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | ayymen/Weblate-Translations | [
"task_categories:translation",
"task_categories:text2text-generation",
"annotations_creators:crowdsourced",
"size_categories:1M<n<10M",
"language:aa",
"language:ab",
"language:ace",
"language:ach",
"language:af",
"language:afh",
"language:aii",
"language:ain",
"language:ajp",
"language:ak",
"language:am",
"language:an",
"language:ang",
"language:anp",
"language:apc",
"language:ar",
"language:arn",
"language:ars",
"language:as",
"language:ast",
"language:ay",
"language:ayc",
"language:az",
"language:azb",
"language:ba",
"language:bar",
"language:bd",
"language:be",
"language:bem",
"language:ber",
"language:bg",
"language:bho",
"language:bm",
"language:bn",
"language:bo",
"language:bp",
"language:bqi",
"language:br",
"language:brx",
"language:bs",
"language:bul",
"language:by",
"language:ca",
"language:ce",
"language:ceb",
"language:ckb",
"language:cmn",
"language:cn",
"language:cnr",
"language:co",
"language:cr",
"language:crh",
"language:cs",
"language:csb",
"language:cv",
"language:cy",
"language:cz",
"language:da",
"language:de",
"language:dev",
"language:doi",
"language:dsb",
"language:dua",
"language:dum",
"language:dv",
"language:dz",
"language:eg",
"language:el",
"language:en",
"language:eng",
"language:enm",
"language:eo",
"language:es",
"language:et",
"language:eu",
"language:ext",
"language:fa",
"language:fi",
"language:fil",
"language:fo",
"language:fr",
"language:fra",
"language:frm",
"language:frp",
"language:frs",
"language:fu",
"language:fur",
"language:fy",
"language:ga",
"language:gb",
"language:gd",
"language:gl",
"language:glk",
"language:gmh",
"language:gn",
"language:gr",
"language:gsw",
"language:gu",
"language:guc",
"language:gug",
"language:gum",
"language:guw",
"language:gv",
"language:ha",
"language:haw",
"language:he",
"language:hi",
"language:hne",
"language:hr",
"language:hrx",
"language:hsb",
"language:ht",
"language:hu",
"language:hy",
"language:hz",
"language:ia",
"language:id",
"language:ie",
"language:ig",
"language:in",
"language:io",
"language:is",
"language:it",
"language:iw",
"language:ja",
"language:jam",
"language:jbo",
"language:ji",
"language:jp",
"language:jpn",
"language:jv",
"language:ka",
"language:kab",
"language:kg",
"language:kk",
"language:kl",
"language:km",
"language:kmr",
"language:kn",
"language:ko",
"language:kok",
"language:kr",
"language:krl",
"language:ks",
"language:ksh",
"language:ku",
"language:kw",
"language:ky",
"language:la",
"language:lb",
"language:lfn",
"language:lg",
"language:li",
"language:lk",
"language:ln",
"language:lo",
"language:lt",
"language:ltg",
"language:lv",
"language:lzh",
"language:mai",
"language:me",
"language:mg",
"language:mhr",
"language:mi",
"language:mjw",
"language:mk",
"language:ml",
"language:mn",
"language:mnc",
"language:mni",
"language:mnw",
"language:mo",
"language:mr",
"language:ms",
"language:mt",
"language:my",
"language:na",
"language:nah",
"language:nan",
"language:nap",
"language:nb",
"language:nds",
"language:ne",
"language:nl",
"language:nn",
"language:no",
"language:np",
"language:nqo",
"language:ny",
"language:oc",
"language:oj",
"language:om",
"language:or",
"language:os",
"language:ota",
"language:pa",
"language:pam",
"language:pap",
"language:pbb",
"language:peo",
"language:pk",
"language:pl",
"language:pms",
"language:pr",
"language:prg",
"language:ps",
"language:pt",
"language:pu",
"language:qt",
"language:rcf",
"language:rm",
"language:ro",
"language:rom",
"language:ru",
"language:rue",
"language:rw",
"language:ryu",
"language:sa",
"language:sah",
"language:sai",
"language:sat",
"language:sc",
"language:sco",
"language:sd",
"language:sdh",
"language:se",
"language:sh",
"language:shn",
"language:si",
"language:sk",
"language:skr",
"language:sl",
"language:sm",
"language:sma",
"language:sn",
"language:so",
"language:sq",
"language:sr",
"language:st",
"language:su",
"language:sv",
"language:sw",
"language:szl",
"language:ta",
"language:tam",
"language:te",
"language:tet",
"language:tg",
"language:th",
"language:ti",
"language:tk",
"language:tl",
"language:tlh",
"language:tn",
"language:to",
"language:tok",
"language:tr",
"language:trv",
"language:tt",
"language:tum",
"language:tw",
"language:ty",
"language:tzm",
"language:ua",
"language:udm",
"language:ug",
"language:uk",
"language:und",
"language:ur",
"language:us",
"language:uz",
"language:vec",
"language:vi",
"language:vls",
"language:wa",
"language:wae",
"language:wo",
"language:xh",
"language:yi",
"language:yo",
"language:yue",
"language:zgh",
"language:zh",
"language:zu",
"region:us"
] | 2024-01-07T22:29:01+00:00 | {"annotations_creators": ["crowdsourced"], "language": ["aa", "ab", "ace", "ach", "af", "afh", "aii", "ain", "ajp", "ak", "am", "an", "ang", "anp", "apc", "ar", "arn", "ars", "as", "ast", "ay", "ayc", "az", "azb", "ba", "bar", "bd", "be", "bem", "ber", "bg", "bho", "bm", "bn", "bo", "bp", "bqi", "br", "brx", "bs", "bul", "by", "ca", "ce", "ceb", "ckb", "cmn", "cn", "cnr", "co", "cr", "crh", "cs", "csb", "cv", "cy", "cz", "da", "de", "dev", "doi", "dsb", "dua", "dum", "dv", "dz", "eg", "el", "en", "eng", "enm", "eo", "es", "et", "eu", "ext", "fa", "fi", "fil", "fo", "fr", "fra", "frm", "frp", "frs", "fu", "fur", "fy", "ga", "gb", "gd", "gl", "glk", "gmh", "gn", "gr", "gsw", "gu", "guc", "gug", "gum", "guw", "gv", "ha", "haw", "he", "hi", "hne", "hr", "hrx", "hsb", "ht", "hu", "hy", "hz", "ia", "id", "ie", "ig", "in", "io", "is", "it", "iw", "ja", "jam", "jbo", "ji", "jp", "jpn", "jv", "ka", "kab", "kg", "kk", "kl", "km", "kmr", "kn", "ko", "kok", "kr", "krl", "ks", "ksh", "ku", "kw", "ky", "la", "lb", "lfn", "lg", "li", "lk", "ln", "lo", "lt", "ltg", "lv", "lzh", "mai", "me", "mg", "mhr", "mi", "mjw", "mk", "ml", "mn", "mnc", "mni", "mnw", "mo", "mr", "ms", "mt", "my", "na", "nah", "nan", "nap", "nb", "nds", "ne", "nl", "nn", "no", "np", "nqo", "ny", "oc", "oj", "om", "or", "os", "ota", "pa", "pam", "pap", "pbb", "peo", "pk", "pl", "pms", "pr", "prg", "ps", "pt", "pu", "qt", "rcf", "rm", "ro", "rom", "ru", "rue", "rw", "ryu", "sa", "sah", "sai", "sat", "sc", "sco", "sd", "sdh", "se", "sh", "shn", "si", "sk", "skr", "sl", "sm", "sma", "sn", "so", "sq", "sr", "st", "su", "sv", "sw", "szl", "ta", "tam", "te", "tet", "tg", "th", "ti", "tk", "tl", "tlh", "tn", "to", "tok", "tr", "trv", "tt", "tum", "tw", "ty", "tzm", "ua", "udm", "ug", "uk", "und", "ur", "us", "uz", "vec", "vi", "vls", "wa", "wae", "wo", "xh", "yi", "yo", "yue", "zgh", "zh", "zu"], "size_categories": ["1M<n<10M"], "task_categories": ["translation", "text2text-generation"], "pretty_name": "Weblate Translations", "configs": [{"config_name": "en-lk", "data_files": "en-lk.tsv"}, {"config_name": "en-en-rAU", "data_files": "en-en-rAU.tsv"}, {"config_name": "en-hy-rAM", "data_files": "en-hy-rAM.tsv"}, {"config_name": "en-qt", "data_files": "en-qt.tsv"}, {"config_name": "en-se", "data_files": "en-se.tsv"}, {"config_name": "en-en_AU", "data_files": "en-en_AU.tsv"}, {"config_name": "en-in", "data_files": "en-in.tsv"}, {"config_name": "en_US-id", "data_files": "en_US-id.tsv"}, {"config_name": "en-ajp", "data_files": "en-ajp.tsv"}, {"config_name": "en-en_US_rude", "data_files": "en-en_US_rude.tsv"}, {"config_name": "en_GB-sw", "data_files": "en_GB-sw.tsv"}, {"config_name": "en_GB-tzm", "data_files": "en_GB-tzm.tsv"}, {"config_name": "dev-pt", "data_files": "dev-pt.tsv"}, {"config_name": "de-nb_NO", "data_files": "de-nb_NO.tsv"}, {"config_name": "en_devel-bn_BD", "data_files": "en_devel-bn_BD.tsv"}, {"config_name": "messages-fr", "data_files": "messages-fr.tsv"}, {"config_name": "en-de-CH", "data_files": "en-de-CH.tsv"}, {"config_name": "en-gu_IN", "data_files": "en-gu_IN.tsv"}, {"config_name": "en-be_BY", "data_files": "en-be_BY.tsv"}, {"config_name": "eo-sk", "data_files": "eo-sk.tsv"}, {"config_name": "en-brx", "data_files": "en-brx.tsv"}, {"config_name": "en-en_US", "data_files": "en-en_US.tsv"}, {"config_name": "en_GB-an", "data_files": "en_GB-an.tsv"}, {"config_name": "en-korean", "data_files": "en-korean.tsv"}, {"config_name": "en_GB-fr-FR", "data_files": "en_GB-fr-FR.tsv"}, {"config_name": 
"en_devel-si", "data_files": "en_devel-si.tsv"}, {"config_name": "en_US-sr_Cyrl", "data_files": "en_US-sr_Cyrl.tsv"}, {"config_name": "en-fr@formal", "data_files": "[email protected]"}, {"config_name": "en_devel-zh_tw", "data_files": "en_devel-zh_tw.tsv"}, {"config_name": "en-en_ud", "data_files": "en-en_ud.tsv"}, {"config_name": "en_GB-bi", "data_files": "en_GB-bi.tsv"}, {"config_name": "en-sq_AL", "data_files": "en-sq_AL.tsv"}, {"config_name": "en-README_zh-CN", "data_files": "en-README_zh-CN.tsv"}, {"config_name": "en_US-ml_IN", "data_files": "en_US-ml_IN.tsv"}, {"config_name": "nb_NO-nn", "data_files": "nb_NO-nn.tsv"}, {"config_name": "en_devel-es_419", "data_files": "en_devel-es_419.tsv"}, {"config_name": "en-de-DE", "data_files": "en-de-DE.tsv"}, {"config_name": "en-dua", "data_files": "en-dua.tsv"}, {"config_name": "en-gu-rIN", "data_files": "en-gu-rIN.tsv"}, {"config_name": "en-ty", "data_files": "en-ty.tsv"}, {"config_name": "nl-pl", "data_files": "nl-pl.tsv"}, {"config_name": "en_US-bo", "data_files": "en_US-bo.tsv"}, {"config_name": "en_devel-ru_RU", "data_files": "en_devel-ru_RU.tsv"}, {"config_name": "en_GB-cy_GB", "data_files": "en_GB-cy_GB.tsv"}, {"config_name": "en_US-zh-TW", "data_files": "en_US-zh-TW.tsv"}, {"config_name": "en_US-zh-hk", "data_files": "en_US-zh-hk.tsv"}, {"config_name": "en-DE", "data_files": "en-DE.tsv"}, {"config_name": "en_US-lzh", "data_files": "en_US-lzh.tsv"}, {"config_name": "sv-sma", "data_files": "sv-sma.tsv"}, {"config_name": "en_GB-fi_FI", "data_files": "en_GB-fi_FI.tsv"}, {"config_name": "en_US-zu", "data_files": "en_US-zu.tsv"}, {"config_name": "en_devel-mr", "data_files": "en_devel-mr.tsv"}, {"config_name": "en_US-he-IL", "data_files": "en_US-he-IL.tsv"}, {"config_name": "en_GB-fur", "data_files": "en_GB-fur.tsv"}, {"config_name": "en-fr_CH", "data_files": "en-fr_CH.tsv"}, {"config_name": "en-en-CA", "data_files": "en-en-CA.tsv"}, {"config_name": "en-ro_MD", "data_files": "en-ro_MD.tsv"}, {"config_name": "en_US-yue_HK", "data_files": "en_US-yue_HK.tsv"}, {"config_name": "es-mr", "data_files": "es-mr.tsv"}, {"config_name": "en_GB-ace", "data_files": "en_GB-ace.tsv"}, {"config_name": "en_GB-lt", "data_files": "en_GB-lt.tsv"}, {"config_name": "en-es-rES", "data_files": "en-es-rES.tsv"}, {"config_name": "en-ksh", "data_files": "en-ksh.tsv"}, {"config_name": "en_GB-ti", "data_files": "en_GB-ti.tsv"}, {"config_name": "en-zh-rSG", "data_files": "en-zh-rSG.tsv"}, {"config_name": "en-ms_Arab", "data_files": "en-ms_Arab.tsv"}, {"config_name": "en-README_CZ", "data_files": "en-README_CZ.tsv"}, {"config_name": "en-ug-CN", "data_files": "en-ug-CN.tsv"}, {"config_name": "en-ar-rYE", "data_files": "en-ar-rYE.tsv"}, {"config_name": "en-pk", "data_files": "en-pk.tsv"}, {"config_name": "en_US-pt", "data_files": "en_US-pt.tsv"}, {"config_name": "en_devel-pt-br", "data_files": "en_devel-pt-br.tsv"}, {"config_name": "en-de_formal", "data_files": "en-de_formal.tsv"}, {"config_name": "en-zh_TW", "data_files": "en-zh_TW.tsv"}, {"config_name": "en-hu-rHU", "data_files": "en-hu-rHU.tsv"}, {"config_name": "en-lv-LV", "data_files": "en-lv-LV.tsv"}, {"config_name": "en-hr_HR", "data_files": "en-hr_HR.tsv"}, {"config_name": "en-en_devel", "data_files": "en-en_devel.tsv"}, {"config_name": "en-ka", "data_files": "en-ka.tsv"}, {"config_name": "en_GB-da_DK", "data_files": "en_GB-da_DK.tsv"}, {"config_name": "en-ar-AR", "data_files": "en-ar-AR.tsv"}, {"config_name": "en-om", "data_files": "en-om.tsv"}, {"config_name": "en_US-id-ID", "data_files": "en_US-id-ID.tsv"}, 
{"config_name": "en-cs_CZ", "data_files": "en-cs_CZ.tsv"}, {"config_name": "it-es_ES", "data_files": "it-es_ES.tsv"}, {"config_name": "en-zh_HK", "data_files": "en-zh_HK.tsv"}, {"config_name": "dev-ko", "data_files": "dev-ko.tsv"}, {"config_name": "en-cr", "data_files": "en-cr.tsv"}, {"config_name": "en-sr_Cyrl", "data_files": "en-sr_Cyrl.tsv"}, {"config_name": "en-nl_BE", "data_files": "en-nl_BE.tsv"}, {"config_name": "en_GB-zh-rTW", "data_files": "en_GB-zh-rTW.tsv"}, {"config_name": "en-da-DK", "data_files": "en-da-DK.tsv"}, {"config_name": "en-ang", "data_files": "en-ang.tsv"}, {"config_name": "en-ur-IN", "data_files": "en-ur-IN.tsv"}, {"config_name": "en-HU", "data_files": "en-HU.tsv"}, {"config_name": "en-kw", "data_files": "en-kw.tsv"}, {"config_name": "en_GB-fo", "data_files": "en_GB-fo.tsv"}, {"config_name": "en-sr-SP", "data_files": "en-sr-SP.tsv"}, {"config_name": "en-pl", "data_files": "en-pl.tsv"}, {"config_name": "en-or", "data_files": "en-or.tsv"}, {"config_name": "en-en-gb", "data_files": "en-en-gb.tsv"}, {"config_name": "en-en", "data_files": "en-en.tsv"}, {"config_name": "en_GB-fa_IR", "data_files": "en_GB-fa_IR.tsv"}, {"config_name": "en-bn-IN", "data_files": "en-bn-IN.tsv"}, {"config_name": "en-pl_pl", "data_files": "en-pl_pl.tsv"}, {"config_name": "en_US-ro_RO", "data_files": "en_US-ro_RO.tsv"}, {"config_name": "en-es_mx", "data_files": "en-es_mx.tsv"}, {"config_name": "en-kk_KZ", "data_files": "en-kk_KZ.tsv"}, {"config_name": "en-ab", "data_files": "en-ab.tsv"}, {"config_name": "en_UK-de_DE", "data_files": "en_UK-de_DE.tsv"}, {"config_name": "eo-de", "data_files": "eo-de.tsv"}, {"config_name": "en_US-fil", "data_files": "en_US-fil.tsv"}, {"config_name": "en-bp", "data_files": "en-bp.tsv"}, {"config_name": "en-ta_IN", "data_files": "en-ta_IN.tsv"}, {"config_name": "en-round", "data_files": "en-round.tsv"}, {"config_name": "en-gd", "data_files": "en-gd.tsv"}, {"config_name": "en_US-en@uwu", "data_files": "[email protected]"}, {"config_name": "en-dum", "data_files": "en-dum.tsv"}, {"config_name": "en-ja_JP", "data_files": "en-ja_JP.tsv"}, {"config_name": "en-ryu", "data_files": "en-ryu.tsv"}, {"config_name": "en-b+en+001", "data_files": "en-b+en+001.tsv"}, {"config_name": "en-en-US", "data_files": "en-en-US.tsv"}, {"config_name": "en-sl_SI", "data_files": "en-sl_SI.tsv"}, {"config_name": "de-it", "data_files": "de-it.tsv"}, {"config_name": "en_GB-sr_RS", "data_files": "en_GB-sr_RS.tsv"}, {"config_name": "en_US-da", "data_files": "en_US-da.tsv"}, {"config_name": "en_GB-tk", "data_files": "en_GB-tk.tsv"}, {"config_name": "en-bn", "data_files": "en-bn.tsv"}, {"config_name": "en_devel-es_bo", "data_files": "en_devel-es_bo.tsv"}, {"config_name": "en-ja_CARES", "data_files": "en-ja_CARES.tsv"}, {"config_name": "en-km-KH", "data_files": "en-km-KH.tsv"}, {"config_name": "en_US-de_DE", "data_files": "en_US-de_DE.tsv"}, {"config_name": "en_US-hu_HU", "data_files": "en_US-hu_HU.tsv"}, {"config_name": "en-ta-rIN", "data_files": "en-ta-rIN.tsv"}, {"config_name": "en_US-ml", "data_files": "en_US-ml.tsv"}, {"config_name": "en-sr_RS", "data_files": "en-sr_RS.tsv"}, {"config_name": "en_US-eu", "data_files": "en_US-eu.tsv"}, {"config_name": "pl-es", "data_files": "pl-es.tsv"}, {"config_name": "en_US-ka", "data_files": "en_US-ka.tsv"}, {"config_name": "en-bulgarian", "data_files": "en-bulgarian.tsv"}, {"config_name": "fr-en", "data_files": "fr-en.tsv"}, {"config_name": "en_devel-nb-rNO", "data_files": "en_devel-nb-rNO.tsv"}, {"config_name": "en_GB-ce", "data_files": "en_GB-ce.tsv"}, 
{"config_name": "en_US-bs", "data_files": "en_US-bs.tsv"}, {"config_name": "en-en@uwu", "data_files": "[email protected]"}, {"config_name": "en_GB-nn", "data_files": "en_GB-nn.tsv"}, {"config_name": "en-pa_PK", "data_files": "en-pa_PK.tsv"}, {"config_name": "en-wae", "data_files": "en-wae.tsv"}, {"config_name": "en-ar_EG", "data_files": "en-ar_EG.tsv"}, {"config_name": "en_GB-lt_LT", "data_files": "en_GB-lt_LT.tsv"}, {"config_name": "en-zh-Hant-HK", "data_files": "en-zh-Hant-HK.tsv"}, {"config_name": "messages-de", "data_files": "messages-de.tsv"}, {"config_name": "en-ur_IN", "data_files": "en-ur_IN.tsv"}, {"config_name": "en-in-rID", "data_files": "en-in-rID.tsv"}, {"config_name": "en-lo-LA", "data_files": "en-lo-LA.tsv"}, {"config_name": "en-el-rGR", "data_files": "en-el-rGR.tsv"}, {"config_name": "en-es-ES", "data_files": "en-es-ES.tsv"}, {"config_name": "en_devel-et", "data_files": "en_devel-et.tsv"}, {"config_name": "en-fr-rCH", "data_files": "en-fr-rCH.tsv"}, {"config_name": "en-en_CA", "data_files": "en-en_CA.tsv"}, {"config_name": "en-b+uz+Latn", "data_files": "en-b+uz+Latn.tsv"}, {"config_name": "en_GB-tig", "data_files": "en_GB-tig.tsv"}, {"config_name": "en_GB-hi_IN", "data_files": "en_GB-hi_IN.tsv"}, {"config_name": "de-pl", "data_files": "de-pl.tsv"}, {"config_name": "en-zh-rCN", "data_files": "en-zh-rCN.tsv"}, {"config_name": "en-hi-rIN", "data_files": "en-hi-rIN.tsv"}, {"config_name": "en-ba", "data_files": "en-ba.tsv"}, {"config_name": "en-fy", "data_files": "en-fy.tsv"}, {"config_name": "en-el-GR", "data_files": "en-el-GR.tsv"}, {"config_name": "en-tum", "data_files": "en-tum.tsv"}, {"config_name": "en-ru-RU", "data_files": "en-ru-RU.tsv"}, {"config_name": "en_US-fa", "data_files": "en_US-fa.tsv"}, {"config_name": "en_GB-ka", "data_files": "en_GB-ka.tsv"}, {"config_name": "es-nb-rNO", "data_files": "es-nb-rNO.tsv"}, {"config_name": "en_US-ckb", "data_files": "en_US-ckb.tsv"}, {"config_name": "en-hi_IN", "data_files": "en-hi_IN.tsv"}, {"config_name": "eo-pa", "data_files": "eo-pa.tsv"}, {"config_name": "en_devel-zh_TW", "data_files": "en_devel-zh_TW.tsv"}, {"config_name": "en_GB-ch", "data_files": "en_GB-ch.tsv"}, {"config_name": "en-sdh", "data_files": "en-sdh.tsv"}, {"config_name": "en-lzh", "data_files": "en-lzh.tsv"}, {"config_name": "en-zh_HANS-CN", "data_files": "en-zh_HANS-CN.tsv"}, {"config_name": "en-li", "data_files": "en-li.tsv"}, {"config_name": "en_devel-zh_cn", "data_files": "en_devel-zh_cn.tsv"}, {"config_name": "en_GB-mk", "data_files": "en_GB-mk.tsv"}, {"config_name": "en_GB-ay", "data_files": "en_GB-ay.tsv"}, {"config_name": "en-sq-rAL", "data_files": "en-sq-rAL.tsv"}, {"config_name": "en-nl_TND", "data_files": "en-nl_TND.tsv"}, {"config_name": "en-th", "data_files": "en-th.tsv"}, {"config_name": "messages-id", "data_files": "messages-id.tsv"}, {"config_name": "en-bo", "data_files": "en-bo.tsv"}, {"config_name": "en-hy", "data_files": "en-hy.tsv"}, {"config_name": "en_US-gd", "data_files": "en_US-gd.tsv"}, {"config_name": "en-tok", "data_files": "en-tok.tsv"}, {"config_name": "pt_BR-en", "data_files": "pt_BR-en.tsv"}, {"config_name": "fr-pt", "data_files": "fr-pt.tsv"}, {"config_name": "en-bs-rBA", "data_files": "en-bs-rBA.tsv"}, {"config_name": "en-zh-hant", "data_files": "en-zh-hant.tsv"}, {"config_name": "en_US-fr", "data_files": "en_US-fr.tsv"}, {"config_name": "en-eu-ES", "data_files": "en-eu-ES.tsv"}, {"config_name": "en-lv_LV", "data_files": "en-lv_LV.tsv"}, {"config_name": "und-fr", "data_files": "und-fr.tsv"}, {"config_name": "en-af-rZA", 
"data_files": "en-af-rZA.tsv"}, {"config_name": "en-da", "data_files": "en-da.tsv"}, {"config_name": "en-os", "data_files": "en-os.tsv"}, {"config_name": "en-fr-CH", "data_files": "en-fr-CH.tsv"}, {"config_name": "en-es_MX", "data_files": "en-es_MX.tsv"}, {"config_name": "nl-bg", "data_files": "nl-bg.tsv"}, {"config_name": "en_GB-ckb", "data_files": "en_GB-ckb.tsv"}, {"config_name": "en-ar-rEG", "data_files": "en-ar-rEG.tsv"}, {"config_name": "en_US-mr", "data_files": "en_US-mr.tsv"}, {"config_name": "en_US-cs-CZ", "data_files": "en_US-cs-CZ.tsv"}, {"config_name": "en_devel-fi", "data_files": "en_devel-fi.tsv"}, {"config_name": "en-mhr", "data_files": "en-mhr.tsv"}, {"config_name": "en-no-rNO", "data_files": "en-no-rNO.tsv"}, {"config_name": "en-it_it", "data_files": "en-it_it.tsv"}, {"config_name": "en-ar-rSA", "data_files": "en-ar-rSA.tsv"}, {"config_name": "en_GB-nso", "data_files": "en_GB-nso.tsv"}, {"config_name": "en-ti", "data_files": "en-ti.tsv"}, {"config_name": "en-iw_HE", "data_files": "en-iw_HE.tsv"}, {"config_name": "en-szl", "data_files": "en-szl.tsv"}, {"config_name": "en_GB-ba", "data_files": "en_GB-ba.tsv"}, {"config_name": "en_devel-cs", "data_files": "en_devel-cs.tsv"}, {"config_name": "en_GB-pl_PL", "data_files": "en_GB-pl_PL.tsv"}, {"config_name": "en-ta_LK", "data_files": "en-ta_LK.tsv"}, {"config_name": "en-uz@latin", "data_files": "[email protected]"}, {"config_name": "en-el", "data_files": "en-el.tsv"}, {"config_name": "en_GB-cs", "data_files": "en_GB-cs.tsv"}, {"config_name": "en-bul_BG", "data_files": "en-bul_BG.tsv"}, {"config_name": "en-fa_IR", "data_files": "en-fa_IR.tsv"}, {"config_name": "en-gsw", "data_files": "en-gsw.tsv"}, {"config_name": "en-ko-KR", "data_files": "en-ko-KR.tsv"}, {"config_name": "en-bs_BA", "data_files": "en-bs_BA.tsv"}, {"config_name": "en_GB-wo", "data_files": "en_GB-wo.tsv"}, {"config_name": "en_devel-it", "data_files": "en_devel-it.tsv"}, {"config_name": "en_US-bn", "data_files": "en_US-bn.tsv"}, {"config_name": "en_devel-pl", "data_files": "en_devel-pl.tsv"}, {"config_name": "en-rm", "data_files": "en-rm.tsv"}, {"config_name": "en-night", "data_files": "en-night.tsv"}, {"config_name": "eo-ca", "data_files": "eo-ca.tsv"}, {"config_name": "en_US-ps", "data_files": "en_US-ps.tsv"}, {"config_name": "en_GB-sd", "data_files": "en_GB-sd.tsv"}, {"config_name": "en-th-TH", "data_files": "en-th-TH.tsv"}, {"config_name": "en-sv-rSE", "data_files": "en-sv-rSE.tsv"}, {"config_name": "en-b+zh+Hans", "data_files": "en-b+zh+Hans.tsv"}, {"config_name": "en_devel-uk", "data_files": "en_devel-uk.tsv"}, {"config_name": "en_US-it_IT", "data_files": "en_US-it_IT.tsv"}, {"config_name": "en-b+hrx", "data_files": "en-b+hrx.tsv"}, {"config_name": "en-my", "data_files": "en-my.tsv"}, {"config_name": "en_GB-sc", "data_files": "en_GB-sc.tsv"}, {"config_name": "en-de_DE_rude", "data_files": "en-de_DE_rude.tsv"}, {"config_name": "en_GB-ff", "data_files": "en_GB-ff.tsv"}, {"config_name": "en_devel-nl", "data_files": "en_devel-nl.tsv"}, {"config_name": "en-shn", "data_files": "en-shn.tsv"}, {"config_name": "en_GB-ca", "data_files": "en_GB-ca.tsv"}, {"config_name": "en-hu_HU", "data_files": "en-hu_HU.tsv"}, {"config_name": "ru-be", "data_files": "ru-be.tsv"}, {"config_name": "es-ml", "data_files": "es-ml.tsv"}, {"config_name": "en_GB-na", "data_files": "en_GB-na.tsv"}, {"config_name": "en_devel-ja", "data_files": "en_devel-ja.tsv"}, {"config_name": "en-pt-rPT-v26", "data_files": "en-pt-rPT-v26.tsv"}, {"config_name": "en_devel-pt_BR", "data_files": 
"en_devel-pt_BR.tsv"}, {"config_name": "en_US-ar_AA", "data_files": "en_US-ar_AA.tsv"}, {"config_name": "en_US-en_GB", "data_files": "en_US-en_GB.tsv"}, {"config_name": "en-de_FORM", "data_files": "en-de_FORM.tsv"}, {"config_name": "en_US-et", "data_files": "en_US-et.tsv"}, {"config_name": "pl-it", "data_files": "pl-it.tsv"}, {"config_name": "messages-ru", "data_files": "messages-ru.tsv"}, {"config_name": "en_devel-en", "data_files": "en_devel-en.tsv"}, {"config_name": "en-te_IN", "data_files": "en-te_IN.tsv"}, {"config_name": "en_US-it-IT", "data_files": "en_US-it-IT.tsv"}, {"config_name": "en-zh-rMO", "data_files": "en-zh-rMO.tsv"}, {"config_name": "en-fy-NL", "data_files": "en-fy-NL.tsv"}, {"config_name": "en-iw-rIL", "data_files": "en-iw-rIL.tsv"}, {"config_name": "en-zh-Hant", "data_files": "en-zh-Hant.tsv"}, {"config_name": "en-es_uy", "data_files": "en-es_uy.tsv"}, {"config_name": "en_GB-or", "data_files": "en_GB-or.tsv"}, {"config_name": "en-tt", "data_files": "en-tt.tsv"}, {"config_name": "de-pt", "data_files": "de-pt.tsv"}, {"config_name": "en-zh-Hans", "data_files": "en-zh-Hans.tsv"}, {"config_name": "en-ar-TN", "data_files": "en-ar-TN.tsv"}, {"config_name": "en_US-si_LK", "data_files": "en_US-si_LK.tsv"}, {"config_name": "en-so", "data_files": "en-so.tsv"}, {"config_name": "en_GB-csb", "data_files": "en_GB-csb.tsv"}, {"config_name": "en-fr-CA", "data_files": "en-fr-CA.tsv"}, {"config_name": "en-es_BO", "data_files": "en-es_BO.tsv"}, {"config_name": "en_devel-es_pa", "data_files": "en_devel-es_pa.tsv"}, {"config_name": "en-vi-VN", "data_files": "en-vi-VN.tsv"}, {"config_name": "en_devel-sw", "data_files": "en_devel-sw.tsv"}, {"config_name": "en-es-rMX", "data_files": "en-es-rMX.tsv"}, {"config_name": "en-eu-rES", "data_files": "en-eu-rES.tsv"}, {"config_name": "en_GB-pi", "data_files": "en_GB-pi.tsv"}, {"config_name": "en_devel-bg", "data_files": "en_devel-bg.tsv"}, {"config_name": "en-ja-JP", "data_files": "en-ja-JP.tsv"}, {"config_name": "en_US-uk", "data_files": "en_US-uk.tsv"}, {"config_name": "en_GB-km", "data_files": "en_GB-km.tsv"}, {"config_name": "en_US-ko", "data_files": "en_US-ko.tsv"}, {"config_name": "en-gmh", "data_files": "en-gmh.tsv"}, {"config_name": "en_US-hy", "data_files": "en_US-hy.tsv"}, {"config_name": "en_GB-ml", "data_files": "en_GB-ml.tsv"}, {"config_name": "en-bn-rIN", "data_files": "en-bn-rIN.tsv"}, {"config_name": "en-ach", "data_files": "en-ach.tsv"}, {"config_name": "en-pt-rBR-v26", "data_files": "en-pt-rBR-v26.tsv"}, {"config_name": "en_US-zh", "data_files": "en_US-zh.tsv"}, {"config_name": "en-sw-rKE", "data_files": "en-sw-rKE.tsv"}, {"config_name": "en_GB-ha", "data_files": "en_GB-ha.tsv"}, {"config_name": "en-en-rGB", "data_files": "en-en-rGB.tsv"}, {"config_name": "en_devel-pt", "data_files": "en_devel-pt.tsv"}, {"config_name": "en-no_NB", "data_files": "en-no_NB.tsv"}, {"config_name": "en-no_NO", "data_files": "en-no_NO.tsv"}, {"config_name": "en-es_es", "data_files": "en-es_es.tsv"}, {"config_name": "en-kk", "data_files": "en-kk.tsv"}, {"config_name": "en-bm", "data_files": "en-bm.tsv"}, {"config_name": "en-pl-PL", "data_files": "en-pl-PL.tsv"}, {"config_name": "en_GB-id", "data_files": "en_GB-id.tsv"}, {"config_name": "en-sr-Latn", "data_files": "en-sr-Latn.tsv"}, {"config_name": "en_US-ms", "data_files": "en_US-ms.tsv"}, {"config_name": "en-et_ET", "data_files": "en-et_ET.tsv"}, {"config_name": "en-b+es+419", "data_files": "en-b+es+419.tsv"}, {"config_name": "en_GB-kw", "data_files": "en_GB-kw.tsv"}, {"config_name": "en-no", "data_files": 
"en-no.tsv"}, {"config_name": "en-wa", "data_files": "en-wa.tsv"}, {"config_name": "en-ber", "data_files": "en-ber.tsv"}, {"config_name": "en_US-es_MX", "data_files": "en_US-es_MX.tsv"}, {"config_name": "en-de_1901", "data_files": "en-de_1901.tsv"}, {"config_name": "en-ja-rJP", "data_files": "en-ja-rJP.tsv"}, {"config_name": "en_US-uk_UA", "data_files": "en_US-uk_UA.tsv"}, {"config_name": "en_US-ja_JP", "data_files": "en_US-ja_JP.tsv"}, {"config_name": "en-b+fr", "data_files": "en-b+fr.tsv"}, {"config_name": "en-pt-br", "data_files": "en-pt-br.tsv"}, {"config_name": "en-te", "data_files": "en-te.tsv"}, {"config_name": "en-np", "data_files": "en-np.tsv"}, {"config_name": "en_GB-gu", "data_files": "en_GB-gu.tsv"}, {"config_name": "en_GB-ki", "data_files": "en_GB-ki.tsv"}, {"config_name": "en-kab-KAB", "data_files": "en-kab-KAB.tsv"}, {"config_name": "de-fr", "data_files": "de-fr.tsv"}, {"config_name": "en-ru_old", "data_files": "en-ru_old.tsv"}, {"config_name": "en_devel-es_do", "data_files": "en_devel-es_do.tsv"}, {"config_name": "en-ua", "data_files": "en-ua.tsv"}, {"config_name": "en-et_EE", "data_files": "en-et_EE.tsv"}, {"config_name": "ia-it", "data_files": "ia-it.tsv"}, {"config_name": "en_GB-ro", "data_files": "en_GB-ro.tsv"}, {"config_name": "en_US-pt-rPT", "data_files": "en_US-pt-rPT.tsv"}, {"config_name": "en-ur_PK", "data_files": "en-ur_PK.tsv"}, {"config_name": "en-pa-rPK", "data_files": "en-pa-rPK.tsv"}, {"config_name": "en-vec", "data_files": "en-vec.tsv"}, {"config_name": "en-nl-rBE", "data_files": "en-nl-rBE.tsv"}, {"config_name": "en-lv", "data_files": "en-lv.tsv"}, {"config_name": "en-ar-rBH", "data_files": "en-ar-rBH.tsv"}, {"config_name": "en-an", "data_files": "en-an.tsv"}, {"config_name": "en_US-sr", "data_files": "en_US-sr.tsv"}, {"config_name": "en-Ukrainian", "data_files": "en-Ukrainian.tsv"}, {"config_name": "en_US-mk", "data_files": "en_US-mk.tsv"}, {"config_name": "en_GB-br", "data_files": "en_GB-br.tsv"}, {"config_name": "en-de@informal", "data_files": "[email protected]"}, {"config_name": "en-dz", "data_files": "en-dz.tsv"}, {"config_name": "en_US-he_IL", "data_files": "en_US-he_IL.tsv"}, {"config_name": "en_GB-mr", "data_files": "en_GB-mr.tsv"}, {"config_name": "en-cs-CARES", "data_files": "en-cs-CARES.tsv"}, {"config_name": "en_US-hi_IN", "data_files": "en_US-hi_IN.tsv"}, {"config_name": "en_US-ro", "data_files": "en_US-ro.tsv"}, {"config_name": "en_US-fr_CA", "data_files": "en_US-fr_CA.tsv"}, {"config_name": "en-as", "data_files": "en-as.tsv"}, {"config_name": "en_GB-ro_MD", "data_files": "en_GB-ro_MD.tsv"}, {"config_name": "en_US-lt-LT", "data_files": "en_US-lt-LT.tsv"}, {"config_name": "fr-ca", "data_files": "fr-ca.tsv"}, {"config_name": "en-be_Latn", "data_files": "en-be_Latn.tsv"}, {"config_name": "en-en-AU", "data_files": "en-en-AU.tsv"}, {"config_name": "en_US-fr_FR", "data_files": "en_US-fr_FR.tsv"}, {"config_name": "en-de-de", "data_files": "en-de-de.tsv"}, {"config_name": "en-nds", "data_files": "en-nds.tsv"}, {"config_name": "en_US-ja", "data_files": "en_US-ja.tsv"}, {"config_name": "en-es-AR", "data_files": "en-es-AR.tsv"}, {"config_name": "en-ms", "data_files": "en-ms.tsv"}, {"config_name": "en-zh-CHS", "data_files": "en-zh-CHS.tsv"}, {"config_name": "en_devel-bs", "data_files": "en_devel-bs.tsv"}, {"config_name": "en-arn", "data_files": "en-arn.tsv"}, {"config_name": "zh_Hans-en", "data_files": "zh_Hans-en.tsv"}, {"config_name": "en-co", "data_files": "en-co.tsv"}, {"config_name": "en-uz_Latn", "data_files": "en-uz_Latn.tsv"}, {"config_name": 
"en-cs-rCZ", "data_files": "en-cs-rCZ.tsv"}, {"config_name": "en-ku", "data_files": "en-ku.tsv"}, {"config_name": "en-ha", "data_files": "en-ha.tsv"}, {"config_name": "en-de-zuerich-lernt", "data_files": "en-de-zuerich-lernt.tsv"}, {"config_name": "en_US-be", "data_files": "en_US-be.tsv"}, {"config_name": "en-tr", "data_files": "en-tr.tsv"}, {"config_name": "en-ru_ru", "data_files": "en-ru_ru.tsv"}, {"config_name": "en-kl", "data_files": "en-kl.tsv"}, {"config_name": "en-it", "data_files": "en-it.tsv"}, {"config_name": "en-b+be+Latn", "data_files": "en-b+be+Latn.tsv"}, {"config_name": "en_devel-mk", "data_files": "en_devel-mk.tsv"}, {"config_name": "en_US-vi", "data_files": "en_US-vi.tsv"}, {"config_name": "en-zh_CMN-HANT", "data_files": "en-zh_CMN-HANT.tsv"}, {"config_name": "en-mnw", "data_files": "en-mnw.tsv"}, {"config_name": "en_US-sv-SE", "data_files": "en_US-sv-SE.tsv"}, {"config_name": "en-gum", "data_files": "en-gum.tsv"}, {"config_name": "en-my_MM", "data_files": "en-my_MM.tsv"}, {"config_name": "en_GB-mk_MK", "data_files": "en_GB-mk_MK.tsv"}, {"config_name": "en_devel-es_ec", "data_files": "en_devel-es_ec.tsv"}, {"config_name": "en_US-ne", "data_files": "en_US-ne.tsv"}, {"config_name": "nl-zh_Hans", "data_files": "nl-zh_Hans.tsv"}, {"config_name": "en-zh_hans", "data_files": "en-zh_hans.tsv"}, {"config_name": "en-sr-rCS", "data_files": "en-sr-rCS.tsv"}, {"config_name": "en-es_NI", "data_files": "en-es_NI.tsv"}, {"config_name": "en_GB-bs", "data_files": "en_GB-bs.tsv"}, {"config_name": "en_GB-tr_TR", "data_files": "en_GB-tr_TR.tsv"}, {"config_name": "ru-en", "data_files": "ru-en.tsv"}, {"config_name": "en_US-my", "data_files": "en_US-my.tsv"}, {"config_name": "en-ia", "data_files": "en-ia.tsv"}, {"config_name": "en-hu-HU", "data_files": "en-hu-HU.tsv"}, {"config_name": "en-nn_NO", "data_files": "en-nn_NO.tsv"}, {"config_name": "en_GB-es_419", "data_files": "en_GB-es_419.tsv"}, {"config_name": "en-ca-rES", "data_files": "en-ca-rES.tsv"}, {"config_name": "en_US-zh-CN", "data_files": "en_US-zh-CN.tsv"}, {"config_name": "en_US-tzm", "data_files": "en_US-tzm.tsv"}, {"config_name": "en-it_CARES", "data_files": "en-it_CARES.tsv"}, {"config_name": "en_GB-he", "data_files": "en_GB-he.tsv"}, {"config_name": "en_US-sn", "data_files": "en_US-sn.tsv"}, {"config_name": "en-ml_IN", "data_files": "en-ml_IN.tsv"}, {"config_name": "en-guc", "data_files": "en-guc.tsv"}, {"config_name": "zh_Hans-ru", "data_files": "zh_Hans-ru.tsv"}, {"config_name": "en-csb", "data_files": "en-csb.tsv"}, {"config_name": "en-nan", "data_files": "en-nan.tsv"}, {"config_name": "en-fa-IR", "data_files": "en-fa-IR.tsv"}, {"config_name": "en_US-en_CA", "data_files": "en_US-en_CA.tsv"}, {"config_name": "en_GB-ar", "data_files": "en_GB-ar.tsv"}, {"config_name": "en_GB-ia_FR", "data_files": "en_GB-ia_FR.tsv"}, {"config_name": "en_US-es-MX", "data_files": "en_US-es-MX.tsv"}, {"config_name": "en_devel-el", "data_files": "en_devel-el.tsv"}, {"config_name": "en_GB-ach", "data_files": "en_GB-ach.tsv"}, {"config_name": "en-Italian", "data_files": "en-Italian.tsv"}, {"config_name": "en_devel-az", "data_files": "en_devel-az.tsv"}, {"config_name": "eo-ru", "data_files": "eo-ru.tsv"}, {"config_name": "en-es_US", "data_files": "en-es_US.tsv"}, {"config_name": "en_devel-cy", "data_files": "en_devel-cy.tsv"}, {"config_name": "en-es-mx", "data_files": "en-es-mx.tsv"}, {"config_name": "en-en-rCA", "data_files": "en-en-rCA.tsv"}, {"config_name": "en-kn-IN", "data_files": "en-kn-IN.tsv"}, {"config_name": "en_devel-zh_CN", "data_files": 
"en_devel-zh_CN.tsv"}, {"config_name": "en_US-lt_LT", "data_files": "en_US-lt_LT.tsv"}, {"config_name": "en_GB-id_ID", "data_files": "en_GB-id_ID.tsv"}, {"config_name": "en-mt", "data_files": "en-mt.tsv"}, {"config_name": "en-bar", "data_files": "en-bar.tsv"}, {"config_name": "en-kr", "data_files": "en-kr.tsv"}, {"config_name": "en_GB-de-DE", "data_files": "en_GB-de-DE.tsv"}, {"config_name": "en-zgh", "data_files": "en-zgh.tsv", "default": true}, {"config_name": "en-german", "data_files": "en-german.tsv"}, {"config_name": "en-de_ch", "data_files": "en-de_ch.tsv"}, {"config_name": "en_devel-hy", "data_files": "en_devel-hy.tsv"}, {"config_name": "en_GB-hr", "data_files": "en_GB-hr.tsv"}, {"config_name": "en_GB-ca_AD", "data_files": "en_GB-ca_AD.tsv"}, {"config_name": "en-b+ca+VALENCIA", "data_files": "en-b+ca+VALENCIA.tsv"}, {"config_name": "en-rw", "data_files": "en-rw.tsv"}, {"config_name": "en-fil-FIL", "data_files": "en-fil-FIL.tsv"}, {"config_name": "it-de", "data_files": "it-de.tsv"}, {"config_name": "en_US-es-rMX", "data_files": "en_US-es-rMX.tsv"}, {"config_name": "en-sk-SK", "data_files": "en-sk-SK.tsv"}, {"config_name": "en-my-MM", "data_files": "en-my-MM.tsv"}, {"config_name": "en-es_ve", "data_files": "en-es_ve.tsv"}, {"config_name": "en-fra-rFR", "data_files": "en-fra-rFR.tsv"}, {"config_name": "en_GB-gv", "data_files": "en_GB-gv.tsv"}, {"config_name": "en-ml-IN", "data_files": "en-ml-IN.tsv"}, {"config_name": "en_US-zh-rHK", "data_files": "en_US-zh-rHK.tsv"}, {"config_name": "en-fur", "data_files": "en-fur.tsv"}, {"config_name": "en_GB-sv", "data_files": "en_GB-sv.tsv"}, {"config_name": "en-ne-rNP", "data_files": "en-ne-rNP.tsv"}, {"config_name": "en_GB-fr", "data_files": "en_GB-fr.tsv"}, {"config_name": "en_US-qya", "data_files": "en_US-qya.tsv"}, {"config_name": "en-ja_KS", "data_files": "en-ja_KS.tsv"}, {"config_name": "en-en_uwu_x", "data_files": "en-en_uwu_x.tsv"}, {"config_name": "en-zh_CN", "data_files": "en-zh_CN.tsv"}, {"config_name": "en-az_AZ", "data_files": "en-az_AZ.tsv"}, {"config_name": "en-bem", "data_files": "en-bem.tsv"}, {"config_name": "en-ars", "data_files": "en-ars.tsv"}, {"config_name": "en-xh", "data_files": "en-xh.tsv"}, {"config_name": "en_US-zh_Hant_HK", "data_files": "en_US-zh_Hant_HK.tsv"}, {"config_name": "en_US-en-rGB", "data_files": "en_US-en-rGB.tsv"}, {"config_name": "en-pam", "data_files": "en-pam.tsv"}, {"config_name": "en_devel-zh-rCN", "data_files": "en_devel-zh-rCN.tsv"}, {"config_name": "en-zh_LATN@pinyin", "data_files": "[email protected]"}, {"config_name": "en_US-en_NZ", "data_files": "en_US-en_NZ.tsv"}, {"config_name": "en-nb_no", "data_files": "en-nb_no.tsv"}, {"config_name": "en-bn-rBD", "data_files": "en-bn-rBD.tsv"}, {"config_name": "en-pl_PL", "data_files": "en-pl_PL.tsv"}, {"config_name": "en-romanian", "data_files": "en-romanian.tsv"}, {"config_name": "en_US-ja_KANJI", "data_files": "en_US-ja_KANJI.tsv"}, {"config_name": "en_US-zh-rCN", "data_files": "en_US-zh-rCN.tsv"}, {"config_name": "en-ca_es", "data_files": "en-ca_es.tsv"}, {"config_name": "en-de_de", "data_files": "en-de_de.tsv"}, {"config_name": "en-rom", "data_files": "en-rom.tsv"}, {"config_name": "en_devel-lv", "data_files": "en_devel-lv.tsv"}, {"config_name": "en-ro", "data_files": "en-ro.tsv"}, {"config_name": "en_US-th-TH", "data_files": "en_US-th-TH.tsv"}, {"config_name": "en_GB-wal", "data_files": "en_GB-wal.tsv"}, {"config_name": "en_US-fi-FI", "data_files": "en_US-fi-FI.tsv"}, {"config_name": "en-ar_AR", "data_files": "en-ar_AR.tsv"}, {"config_name": "en_US-el", 
"data_files": "en_US-el.tsv"}, {"config_name": "en_GB-chr", "data_files": "en_GB-chr.tsv"}, {"config_name": "en-pbb", "data_files": "en-pbb.tsv"}, {"config_name": "en-ar-rXB", "data_files": "en-ar-rXB.tsv"}, {"config_name": "en-tzm", "data_files": "en-tzm.tsv"}, {"config_name": "en-mr-rIN", "data_files": "en-mr-rIN.tsv"}, {"config_name": "en-ms-rMY", "data_files": "en-ms-rMY.tsv"}, {"config_name": "en-apc", "data_files": "en-apc.tsv"}, {"config_name": "en_GB-fi", "data_files": "en_GB-fi.tsv"}, {"config_name": "en_US-hi", "data_files": "en_US-hi.tsv"}, {"config_name": "en-hz", "data_files": "en-hz.tsv"}, {"config_name": "en_GB-mi", "data_files": "en_GB-mi.tsv"}, {"config_name": "en-sai", "data_files": "en-sai.tsv"}, {"config_name": "en-ig", "data_files": "en-ig.tsv"}, {"config_name": "en-en_Shaw", "data_files": "en-en_Shaw.tsv"}, {"config_name": "en_US-fa_IR", "data_files": "en_US-fa_IR.tsv"}, {"config_name": "en-mr", "data_files": "en-mr.tsv"}, {"config_name": "en-pl_PL_rude", "data_files": "en-pl_PL_rude.tsv"}, {"config_name": "en-cv", "data_files": "en-cv.tsv"}, {"config_name": "messages-ar", "data_files": "messages-ar.tsv"}, {"config_name": "en-ko_KO", "data_files": "en-ko_KO.tsv"}, {"config_name": "en_US-zh-hans", "data_files": "en_US-zh-hans.tsv"}, {"config_name": "en-ga-IE", "data_files": "en-ga-IE.tsv"}, {"config_name": "en-am", "data_files": "en-am.tsv"}, {"config_name": "en-ug", "data_files": "en-ug.tsv"}, {"config_name": "en-af_ZA", "data_files": "en-af_ZA.tsv"}, {"config_name": "en-ES", "data_files": "en-ES.tsv"}, {"config_name": "en_US-ru_RU", "data_files": "en_US-ru_RU.tsv"}, {"config_name": "en_GB-lv", "data_files": "en_GB-lv.tsv"}, {"config_name": "en-yi", "data_files": "en-yi.tsv"}, {"config_name": "en_GB-pl", "data_files": "en_GB-pl.tsv"}, {"config_name": "en_GB-tl", "data_files": "en_GB-tl.tsv"}, {"config_name": "en-km", "data_files": "en-km.tsv"}, {"config_name": "en-azb", "data_files": "en-azb.tsv"}, {"config_name": "en_devel-fr", "data_files": "en_devel-fr.tsv"}, {"config_name": "en-pa-PK", "data_files": "en-pa-PK.tsv"}, {"config_name": "en-tn", "data_files": "en-tn.tsv"}, {"config_name": "en-mjw", "data_files": "en-mjw.tsv"}, {"config_name": "en-frs", "data_files": "en-frs.tsv"}, {"config_name": "en-it-IT", "data_files": "en-it-IT.tsv"}, {"config_name": "en-ro_RO", "data_files": "en-ro_RO.tsv"}, {"config_name": "en_US-nl_NL", "data_files": "en_US-nl_NL.tsv"}, {"config_name": "en-ht", "data_files": "en-ht.tsv"}, {"config_name": "en_devel-es_cr", "data_files": "en_devel-es_cr.tsv"}, {"config_name": "en_US-zh-rTW", "data_files": "en_US-zh-rTW.tsv"}, {"config_name": "en-fo", "data_files": "en-fo.tsv"}, {"config_name": "en-skr", "data_files": "en-skr.tsv"}, {"config_name": "en-ak", "data_files": "en-ak.tsv"}, {"config_name": "en_GB-sr@latin", "data_files": "[email protected]"}, {"config_name": "en_US-de_CH", "data_files": "en_US-de_CH.tsv"}, {"config_name": "en_US-uk-UA", "data_files": "en_US-uk-UA.tsv"}, {"config_name": "en-ko_KR", "data_files": "en-ko_KR.tsv"}, {"config_name": "en-cy", "data_files": "en-cy.tsv"}, {"config_name": "en-galo", "data_files": "en-galo.tsv"}, {"config_name": "en-bn_BD", "data_files": "en-bn_BD.tsv"}, {"config_name": "en_devel-ms", "data_files": "en_devel-ms.tsv"}, {"config_name": "fr-it", "data_files": "fr-it.tsv"}, {"config_name": "en-ny", "data_files": "en-ny.tsv"}, {"config_name": "en-tet", "data_files": "en-tet.tsv"}, {"config_name": "en_GB-sk", "data_files": "en_GB-sk.tsv"}, {"config_name": "eo-ar", "data_files": "eo-ar.tsv"}, 
{"config_name": "eo-es", "data_files": "eo-es.tsv"}, {"config_name": "en-bho", "data_files": "en-bho.tsv"}, {"config_name": "en-pap", "data_files": "en-pap.tsv"}, {"config_name": "en-vi_VN", "data_files": "en-vi_VN.tsv"}, {"config_name": "en_US-ar", "data_files": "en_US-ar.tsv"}, {"config_name": "en_devel-nb", "data_files": "en_devel-nb.tsv"}, {"config_name": "en_devel-es_mx", "data_files": "en_devel-es_mx.tsv"}, {"config_name": "es-ca", "data_files": "es-ca.tsv"}, {"config_name": "en_GB-kn", "data_files": "en_GB-kn.tsv"}, {"config_name": "en-ru_UA", "data_files": "en-ru_UA.tsv"}, {"config_name": "sv-nb", "data_files": "sv-nb.tsv"}, {"config_name": "en_GB-zh_Hans", "data_files": "en_GB-zh_Hans.tsv"}, {"config_name": "en-he-IL", "data_files": "en-he-IL.tsv"}, {"config_name": "en_GB-et", "data_files": "en_GB-et.tsv"}, {"config_name": "es-pl", "data_files": "es-pl.tsv"}, {"config_name": "en-hy-AM", "data_files": "en-hy-AM.tsv"}, {"config_name": "en_US-cy", "data_files": "en_US-cy.tsv"}, {"config_name": "en-hu-rZZ", "data_files": "en-hu-rZZ.tsv"}, {"config_name": "en-by", "data_files": "en-by.tsv"}, {"config_name": "en_GB-hy", "data_files": "en_GB-hy.tsv"}, {"config_name": "en_US-zh-Hant", "data_files": "en_US-zh-Hant.tsv"}, {"config_name": "en-gu-IN", "data_files": "en-gu-IN.tsv"}, {"config_name": "en_GB-ml_IN", "data_files": "en_GB-ml_IN.tsv"}, {"config_name": "de-nl", "data_files": "de-nl.tsv"}, {"config_name": "en_devel-ur", "data_files": "en_devel-ur.tsv"}, {"config_name": "en-ca-ES", "data_files": "en-ca-ES.tsv"}, {"config_name": "en_GB-kl", "data_files": "en_GB-kl.tsv"}, {"config_name": "en_US-ta_IN", "data_files": "en_US-ta_IN.tsv"}, {"config_name": "en_US-sk_SK", "data_files": "en_US-sk_SK.tsv"}, {"config_name": "en-zh_Latn", "data_files": "en-zh_Latn.tsv"}, {"config_name": "en_GB-es", "data_files": "en_GB-es.tsv"}, {"config_name": "en-en_uk", "data_files": "en-en_uk.tsv"}, {"config_name": "en_GB-ru", "data_files": "en_GB-ru.tsv"}, {"config_name": "en-gu", "data_files": "en-gu.tsv"}, {"config_name": "en_US-km", "data_files": "en_US-km.tsv"}, {"config_name": "en_GB-uz", "data_files": "en_GB-uz.tsv"}, {"config_name": "en_US-yue-HK", "data_files": "en_US-yue-HK.tsv"}, {"config_name": "en-ceb", "data_files": "en-ceb.tsv"}, {"config_name": "en-is", "data_files": "en-is.tsv"}, {"config_name": "en-ug@Arab", "data_files": "[email protected]"}, {"config_name": "es-ru", "data_files": "es-ru.tsv"}, {"config_name": "en-pt", "data_files": "en-pt.tsv"}, {"config_name": "en-es-US", "data_files": "en-es-US.tsv"}, {"config_name": "en-zh-rCMN-HANT", "data_files": "en-zh-rCMN-HANT.tsv"}, {"config_name": "en-jbo-EN", "data_files": "en-jbo-EN.tsv"}, {"config_name": "en_US-pa", "data_files": "en_US-pa.tsv"}, {"config_name": "en_US-or", "data_files": "en_US-or.tsv"}, {"config_name": "dev-hu", "data_files": "dev-hu.tsv"}, {"config_name": "en-b+ast", "data_files": "en-b+ast.tsv"}, {"config_name": "messages-vi", "data_files": "messages-vi.tsv"}, {"config_name": "en-ht-HT", "data_files": "en-ht-HT.tsv"}, {"config_name": "en-ar_AA", "data_files": "en-ar_AA.tsv"}, {"config_name": "en-mcc234", "data_files": "en-mcc234.tsv"}, {"config_name": "en_GB-he_IL", "data_files": "en_GB-he_IL.tsv"}, {"config_name": "en-fr_FR", "data_files": "en-fr_FR.tsv"}, {"config_name": "en-es_ES", "data_files": "en-es_ES.tsv"}, {"config_name": "en-tr-v26", "data_files": "en-tr-v26.tsv"}, {"config_name": "ru-kk", "data_files": "ru-kk.tsv"}, {"config_name": "en_GB-ky", "data_files": "en_GB-ky.tsv"}, {"config_name": "en-st", "data_files": 
"en-st.tsv"}, {"config_name": "en-ky", "data_files": "en-ky.tsv"}, {"config_name": "en_GB-fa", "data_files": "en_GB-fa.tsv"}, {"config_name": "en-ta", "data_files": "en-ta.tsv"}, {"config_name": "en_US-ru-RU", "data_files": "en_US-ru-RU.tsv"}, {"config_name": "en_US-it", "data_files": "en_US-it.tsv"}, {"config_name": "en-mai", "data_files": "en-mai.tsv"}, {"config_name": "en_GB-ga", "data_files": "en_GB-ga.tsv"}, {"config_name": "en-ay", "data_files": "en-ay.tsv"}, {"config_name": "en-pt_PT", "data_files": "en-pt_PT.tsv"}, {"config_name": "en-fa-rIR", "data_files": "en-fa-rIR.tsv"}, {"config_name": "en-sk_SK", "data_files": "en-sk_SK.tsv"}, {"config_name": "en-ru_sov", "data_files": "en-ru_sov.tsv"}, {"config_name": "en-pt-PT", "data_files": "en-pt-PT.tsv"}, {"config_name": "en_US-ko-KR", "data_files": "en_US-ko-KR.tsv"}, {"config_name": "en-es-rCO", "data_files": "en-es-rCO.tsv"}, {"config_name": "en-zh", "data_files": "en-zh.tsv"}, {"config_name": "en_US-ber", "data_files": "en_US-ber.tsv"}, {"config_name": "en-en_NZ", "data_files": "en-en_NZ.tsv"}, {"config_name": "eo-hi", "data_files": "eo-hi.tsv"}, {"config_name": "en_US-kab", "data_files": "en_US-kab.tsv"}, {"config_name": "en_GB-ru_RU", "data_files": "en_GB-ru_RU.tsv"}, {"config_name": "en-kok@latin", "data_files": "[email protected]"}, {"config_name": "en-ne_NP", "data_files": "en-ne_NP.tsv"}, {"config_name": "en-no-NO", "data_files": "en-no-NO.tsv"}, {"config_name": "it-nl_NL", "data_files": "it-nl_NL.tsv"}, {"config_name": "en-HE", "data_files": "en-HE.tsv"}, {"config_name": "eo-ja", "data_files": "eo-ja.tsv"}, {"config_name": "en_US-kmr", "data_files": "en_US-kmr.tsv"}, {"config_name": "en-pt-BR", "data_files": "en-pt-BR.tsv"}, {"config_name": "en-pl-v26", "data_files": "en-pl-v26.tsv"}, {"config_name": "en_devel-zh-tw", "data_files": "en_devel-zh-tw.tsv"}, {"config_name": "en-mcc235", "data_files": "en-mcc235.tsv"}, {"config_name": "en-el-gr", "data_files": "en-el-gr.tsv"}, {"config_name": "en-ga", "data_files": "en-ga.tsv"}, {"config_name": "en_GB-zh_CN", "data_files": "en_GB-zh_CN.tsv"}, {"config_name": "en_GB-kab", "data_files": "en_GB-kab.tsv"}, {"config_name": "en-te-IN", "data_files": "en-te-IN.tsv"}, {"config_name": "en_GB-de", "data_files": "en_GB-de.tsv"}, {"config_name": "und-de", "data_files": "und-de.tsv"}, {"config_name": "en-nb-rNO-v26", "data_files": "en-nb-rNO-v26.tsv"}, {"config_name": "en-zh_SIMPLIFIED", "data_files": "en-zh_SIMPLIFIED.tsv"}, {"config_name": "en-ur-rPK", "data_files": "en-ur-rPK.tsv"}, {"config_name": "en_US-zh-cn", "data_files": "en_US-zh-cn.tsv"}, {"config_name": "en_devel-pa", "data_files": "en_devel-pa.tsv"}, {"config_name": "en-aii", "data_files": "en-aii.tsv"}, {"config_name": "en_GB-it_IT", "data_files": "en_GB-it_IT.tsv"}, {"config_name": "en_GB-yo", "data_files": "en_GB-yo.tsv"}, {"config_name": "de-id", "data_files": "de-id.tsv"}, {"config_name": "en_GB-nv", "data_files": "en_GB-nv.tsv"}, {"config_name": "en-sw-KE", "data_files": "en-sw-KE.tsv"}, {"config_name": "en_US-so", "data_files": "en_US-so.tsv"}, {"config_name": "en-yue", "data_files": "en-yue.tsv"}, {"config_name": "en-ps", "data_files": "en-ps.tsv"}, {"config_name": "en-mr-IN", "data_files": "en-mr-IN.tsv"}, {"config_name": "de-cs", "data_files": "de-cs.tsv"}, {"config_name": "en_GB-pt-BR", "data_files": "en_GB-pt-BR.tsv"}, {"config_name": "en-ne", "data_files": "en-ne.tsv"}, {"config_name": "en_GB-kk", "data_files": "en_GB-kk.tsv"}, {"config_name": "en-af-ZA", "data_files": "en-af-ZA.tsv"}, {"config_name": "en-pa", 
"data_files": "en-pa.tsv"}, {"config_name": "en_US-lt", "data_files": "en_US-lt.tsv"}, {"config_name": "en-b+qtq+Latn", "data_files": "en-b+qtq+Latn.tsv"}, {"config_name": "zh_Hant-zgh", "data_files": "zh_Hant-zgh.tsv"}, {"config_name": "en-ta-IN", "data_files": "en-ta-IN.tsv"}, {"config_name": "en_GB-hu", "data_files": "en_GB-hu.tsv"}, {"config_name": "en-iw", "data_files": "en-iw.tsv"}, {"config_name": "es-hi", "data_files": "es-hi.tsv"}, {"config_name": "en-es_EC", "data_files": "en-es_EC.tsv"}, {"config_name": "en-ukrainian", "data_files": "en-ukrainian.tsv"}, {"config_name": "en_US-he", "data_files": "en_US-he.tsv"}, {"config_name": "en_GB-sl", "data_files": "en_GB-sl.tsv"}, {"config_name": "en_devel-sgs", "data_files": "en_devel-sgs.tsv"}, {"config_name": "en_US-zh-HK", "data_files": "en_US-zh-HK.tsv"}, {"config_name": "en_US-th_TH", "data_files": "en_US-th_TH.tsv"}, {"config_name": "en-nl_NL", "data_files": "en-nl_NL.tsv"}, {"config_name": "en-zh-HK", "data_files": "en-zh-HK.tsv"}, {"config_name": "en-zh-hans", "data_files": "en-zh-hans.tsv"}, {"config_name": "en_devel-he", "data_files": "en_devel-he.tsv"}, {"config_name": "en_GB-ur", "data_files": "en_GB-ur.tsv"}, {"config_name": "en_GB-da", "data_files": "en_GB-da.tsv"}, {"config_name": "en_GB-bn", "data_files": "en_GB-bn.tsv"}, {"config_name": "en-chinese", "data_files": "en-chinese.tsv"}, {"config_name": "en-bg-BG", "data_files": "en-bg-BG.tsv"}, {"config_name": "en_devel-jpn_JP", "data_files": "en_devel-jpn_JP.tsv"}, {"config_name": "en_devel-id", "data_files": "en_devel-id.tsv"}, {"config_name": "und-ru", "data_files": "und-ru.tsv"}, {"config_name": "en_devel-in", "data_files": "en_devel-in.tsv"}, {"config_name": "en-wo", "data_files": "en-wo.tsv"}, {"config_name": "nl-da", "data_files": "nl-da.tsv"}, {"config_name": "en-pa-Arab-PK", "data_files": "en-pa-Arab-PK.tsv"}, {"config_name": "en-gr-GR", "data_files": "en-gr-GR.tsv"}, {"config_name": "en-az-AZ", "data_files": "en-az-AZ.tsv"}, {"config_name": "en-bg", "data_files": "en-bg.tsv"}, {"config_name": "en-es-rAR", "data_files": "en-es-rAR.tsv"}, {"config_name": "en-nb-NO", "data_files": "en-nb-NO.tsv"}, {"config_name": "en_UK-bg_BG", "data_files": "en_UK-bg_BG.tsv"}, {"config_name": "en_GB-pap", "data_files": "en_GB-pap.tsv"}, {"config_name": "en_US-es", "data_files": "en_US-es.tsv"}, {"config_name": "en_US-hu", "data_files": "en_US-hu.tsv"}, {"config_name": "en-or-IN", "data_files": "en-or-IN.tsv"}, {"config_name": "en-guw", "data_files": "en-guw.tsv"}, {"config_name": "en-nl-BE", "data_files": "en-nl-BE.tsv"}, {"config_name": "en-ml-rIN", "data_files": "en-ml-rIN.tsv"}, {"config_name": "en-ji", "data_files": "en-ji.tsv"}, {"config_name": "en_US-ta", "data_files": "en_US-ta.tsv"}, {"config_name": "es-ur", "data_files": "es-ur.tsv"}, {"config_name": "en-br", "data_files": "en-br.tsv"}, {"config_name": "de-en", "data_files": "de-en.tsv"}, {"config_name": "dev-fr", "data_files": "dev-fr.tsv"}, {"config_name": "en-ace", "data_files": "en-ace.tsv"}, {"config_name": "en_US-zh_TW", "data_files": "en_US-zh_TW.tsv"}, {"config_name": "en-oj", "data_files": "en-oj.tsv"}, {"config_name": "en-zh_tw", "data_files": "en-zh_tw.tsv"}, {"config_name": "en-cnr", "data_files": "en-cnr.tsv"}, {"config_name": "en_devel-es_hn", "data_files": "en_devel-es_hn.tsv"}, {"config_name": "dev-uk", "data_files": "dev-uk.tsv"}, {"config_name": "en-ru_CARES", "data_files": "en-ru_CARES.tsv"}, {"config_name": "en-uroc", "data_files": "en-uroc.tsv"}, {"config_name": "en_GB-bg_BG", "data_files": 
"en_GB-bg_BG.tsv"}, {"config_name": "en_GB-ar_SA", "data_files": "en_GB-ar_SA.tsv"}, {"config_name": "en_US-fy", "data_files": "en_US-fy.tsv"}, {"config_name": "en-lt", "data_files": "en-lt.tsv"}, {"config_name": "en-de-rDE", "data_files": "en-de-rDE.tsv"}, {"config_name": "en_US-ast", "data_files": "en_US-ast.tsv"}, {"config_name": "en_US-ko_KR", "data_files": "en_US-ko_KR.tsv"}, {"config_name": "en_devel-ar_DZ", "data_files": "en_devel-ar_DZ.tsv"}, {"config_name": "en_devel-hu", "data_files": "en_devel-hu.tsv"}, {"config_name": "en-fr_BE", "data_files": "en-fr_BE.tsv"}, {"config_name": "en-kmr", "data_files": "en-kmr.tsv"}, {"config_name": "en_devel-ro_ro", "data_files": "en_devel-ro_ro.tsv"}, {"config_name": "en_GB-vi_VN", "data_files": "en_GB-vi_VN.tsv"}, {"config_name": "en_devel-sk", "data_files": "en_devel-sk.tsv"}, {"config_name": "und-nl_BE", "data_files": "und-nl_BE.tsv"}, {"config_name": "eo-bn", "data_files": "eo-bn.tsv"}, {"config_name": "en-hungarian", "data_files": "en-hungarian.tsv"}, {"config_name": "en_GB-ta", "data_files": "en_GB-ta.tsv"}, {"config_name": "en_US-ca", "data_files": "en_US-ca.tsv"}, {"config_name": "en-oc", "data_files": "en-oc.tsv"}, {"config_name": "en_US-bg_BG", "data_files": "en_US-bg_BG.tsv"}, {"config_name": "en-hr", "data_files": "en-hr.tsv"}, {"config_name": "en_GB-zh_Hant", "data_files": "en_GB-zh_Hant.tsv"}, {"config_name": "en_GB-bn_BD", "data_files": "en_GB-bn_BD.tsv"}, {"config_name": "en-ca@valencia", "data_files": "[email protected]"}, {"config_name": "en_GB-mai", "data_files": "en_GB-mai.tsv"}, {"config_name": "en-uk-UA", "data_files": "en-uk-UA.tsv"}, {"config_name": "en-frm", "data_files": "en-frm.tsv"}, {"config_name": "en-bd", "data_files": "en-bd.tsv"}, {"config_name": "en_GB-ja", "data_files": "en_GB-ja.tsv"}, {"config_name": "en_US-sw", "data_files": "en_US-sw.tsv"}, {"config_name": "eo-uk", "data_files": "eo-uk.tsv"}, {"config_name": "en_US-es-rAR", "data_files": "en_US-es-rAR.tsv"}, {"config_name": "en-az-rAZ", "data_files": "en-az-rAZ.tsv"}, {"config_name": "en_GB-es-ES", "data_files": "en_GB-es-ES.tsv"}, {"config_name": "en-sl-SL", "data_files": "en-sl-SL.tsv"}, {"config_name": "en-pms", "data_files": "en-pms.tsv"}, {"config_name": "en_GB-te", "data_files": "en_GB-te.tsv"}, {"config_name": "it-de_DE", "data_files": "it-de_DE.tsv"}, {"config_name": "en-yue_Hant", "data_files": "en-yue_Hant.tsv"}, {"config_name": "en-en-rIN", "data_files": "en-en-rIN.tsv"}, {"config_name": "en-ln", "data_files": "en-ln.tsv"}, {"config_name": "en-pt-rBR", "data_files": "en-pt-rBR.tsv"}, {"config_name": "en_US-az_AZ", "data_files": "en_US-az_AZ.tsv"}, {"config_name": "en-pl-rPL", "data_files": "en-pl-rPL.tsv"}, {"config_name": "eo-el", "data_files": "eo-el.tsv"}, {"config_name": "eo-ms", "data_files": "eo-ms.tsv"}, {"config_name": "en_US-tr", "data_files": "en_US-tr.tsv"}, {"config_name": "en-en_SHAW", "data_files": "en-en_SHAW.tsv"}, {"config_name": "en-ar-rIQ", "data_files": "en-ar-rIQ.tsv"}, {"config_name": "en-yo", "data_files": "en-yo.tsv"}, {"config_name": "en-japanese", "data_files": "en-japanese.tsv"}, {"config_name": "es-id", "data_files": "es-id.tsv"}, {"config_name": "en-fa_AF", "data_files": "en-fa_AF.tsv"}, {"config_name": "en_GB-ms", "data_files": "en_GB-ms.tsv"}, {"config_name": "en-Zh-CHS", "data_files": "en-Zh-CHS.tsv"}, {"config_name": "en_GB-mt", "data_files": "en_GB-mt.tsv"}, {"config_name": "en-b+de", "data_files": "en-b+de.tsv"}, {"config_name": "en_US-fi", "data_files": "en_US-fi.tsv"}, {"config_name": "de-ar", "data_files": 
"de-ar.tsv"}, {"config_name": "en-en-GB", "data_files": "en-en-GB.tsv"}, {"config_name": "en-mo", "data_files": "en-mo.tsv"}, {"config_name": "en_devel-zh_Hans", "data_files": "en_devel-zh_Hans.tsv"}, {"config_name": "en_GB-dz", "data_files": "en_GB-dz.tsv"}, {"config_name": "en_US-gl", "data_files": "en_US-gl.tsv"}, {"config_name": "en-pt-rPT", "data_files": "en-pt-rPT.tsv"}, {"config_name": "en_devel-es_pr", "data_files": "en_devel-es_pr.tsv"}, {"config_name": "en-RU", "data_files": "en-RU.tsv"}, {"config_name": "en-en-rUS", "data_files": "en-en-rUS.tsv"}, {"config_name": "en-sv_se", "data_files": "en-sv_se.tsv"}, {"config_name": "en-italian", "data_files": "en-italian.tsv"}, {"config_name": "en_US-lv", "data_files": "en_US-lv.tsv"}, {"config_name": "de-ru", "data_files": "de-ru.tsv"}, {"config_name": "en-sc", "data_files": "en-sc.tsv"}, {"config_name": "en-gv", "data_files": "en-gv.tsv"}, {"config_name": "en_US-pt_PT", "data_files": "en_US-pt_PT.tsv"}, {"config_name": "en_GB-bn_IN", "data_files": "en_GB-bn_IN.tsv"}, {"config_name": "en_US-fr-FR", "data_files": "en_US-fr-FR.tsv"}, {"config_name": "ia-es", "data_files": "ia-es.tsv"}, {"config_name": "en_US-es_UY", "data_files": "en_US-es_UY.tsv"}, {"config_name": "en_GB-hr_HR", "data_files": "en_GB-hr_HR.tsv"}, {"config_name": "en-id_ID", "data_files": "en-id_ID.tsv"}, {"config_name": "en-es_VE", "data_files": "en-es_VE.tsv"}, {"config_name": "en-ie", "data_files": "en-ie.tsv"}, {"config_name": "en-it_IT", "data_files": "en-it_IT.tsv"}, {"config_name": "en_GB-si_LK", "data_files": "en_GB-si_LK.tsv"}, {"config_name": "en-nqo", "data_files": "en-nqo.tsv"}, {"config_name": "pl-uk", "data_files": "pl-uk.tsv"}, {"config_name": "en-sco", "data_files": "en-sco.tsv"}, {"config_name": "en_US-tr-TR", "data_files": "en_US-tr-TR.tsv"}, {"config_name": "en-en_GB", "data_files": "en-en_GB.tsv"}, {"config_name": "en-b+kab", "data_files": "en-b+kab.tsv"}, {"config_name": "en-he-rIL", "data_files": "en-he-rIL.tsv"}, {"config_name": "en-pu", "data_files": "en-pu.tsv"}, {"config_name": "de-lb", "data_files": "de-lb.tsv"}, {"config_name": "en-is_IS", "data_files": "en-is_IS.tsv"}, {"config_name": "en_US-cs", "data_files": "en_US-cs.tsv"}, {"config_name": "en_GB-nah", "data_files": "en_GB-nah.tsv"}, {"config_name": "de-tr", "data_files": "de-tr.tsv"}, {"config_name": "zh_Hant-en_US", "data_files": "zh_Hant-en_US.tsv"}, {"config_name": "pl-ru", "data_files": "pl-ru.tsv"}, {"config_name": "en-zh-TW", "data_files": "en-zh-TW.tsv"}, {"config_name": "en_GB-kok", "data_files": "en_GB-kok.tsv"}, {"config_name": "en_US-zh-Hans", "data_files": "en_US-zh-Hans.tsv"}, {"config_name": "en_devel-da", "data_files": "en_devel-da.tsv"}, {"config_name": "en-mg", "data_files": "en-mg.tsv"}, {"config_name": "en-pa-rIN", "data_files": "en-pa-rIN.tsv"}, {"config_name": "en-nb_NO", "data_files": "en-nb_NO.tsv"}, {"config_name": "en_GB-az", "data_files": "en_GB-az.tsv"}, {"config_name": "en-ca_valencia", "data_files": "en-ca_valencia.tsv"}, {"config_name": "en-su", "data_files": "en-su.tsv"}, {"config_name": "und-sv", "data_files": "und-sv.tsv"}, {"config_name": "pl-en", "data_files": "pl-en.tsv"}, {"config_name": "en-ar-rDZ", "data_files": "en-ar-rDZ.tsv"}, {"config_name": "en_US-eo", "data_files": "en_US-eo.tsv"}, {"config_name": "en_US-sq", "data_files": "en_US-sq.tsv"}, {"config_name": "en-sl-rSI", "data_files": "en-sl-rSI.tsv"}, {"config_name": "en-uk-rUA", "data_files": "en-uk-rUA.tsv"}, {"config_name": "en_devel-te", "data_files": "en_devel-te.tsv"}, {"config_name": 
"en-da_DK", "data_files": "en-da_DK.tsv"}, {"config_name": "en_GB-et_EE", "data_files": "en_GB-et_EE.tsv"}, {"config_name": "en-et-EE", "data_files": "en-et-EE.tsv"}, {"config_name": "en-pa_IN", "data_files": "en-pa_IN.tsv"}, {"config_name": "en_US-nn", "data_files": "en_US-nn.tsv"}, {"config_name": "en_GB-xh", "data_files": "en_GB-xh.tsv"}, {"config_name": "en_devel-sv", "data_files": "en_devel-sv.tsv"}, {"config_name": "en-ru-rRU", "data_files": "en-ru-rRU.tsv"}, {"config_name": "en_US-hr", "data_files": "en_US-hr.tsv"}, {"config_name": "en-sr_Latn", "data_files": "en-sr_Latn.tsv"}, {"config_name": "en_GB-uk", "data_files": "en_GB-uk.tsv"}, {"config_name": "en_GB-ee", "data_files": "en_GB-ee.tsv"}, {"config_name": "en_devel-ta", "data_files": "en_devel-ta.tsv"}, {"config_name": "en_US-hu-HU", "data_files": "en_US-hu-HU.tsv"}, {"config_name": "en_GB-ak", "data_files": "en_GB-ak.tsv"}, {"config_name": "en_US-ia", "data_files": "en_US-ia.tsv"}, {"config_name": "en_UK-it_IT", "data_files": "en_UK-it_IT.tsv"}, {"config_name": "en-ru", "data_files": "en-ru.tsv"}, {"config_name": "en_US-es-ar", "data_files": "en_US-es-ar.tsv"}, {"config_name": "en_US-lo", "data_files": "en_US-lo.tsv"}, {"config_name": "en-ur-PK", "data_files": "en-ur-PK.tsv"}, {"config_name": "en_devel-nb_NO", "data_files": "en_devel-nb_NO.tsv"}, {"config_name": "en_GB-es_ES", "data_files": "en_GB-es_ES.tsv"}, {"config_name": "en_GB-ast", "data_files": "en_GB-ast.tsv"}, {"config_name": "en-hr-HR", "data_files": "en-hr-HR.tsv"}, {"config_name": "en-fr@informal", "data_files": "[email protected]"}, {"config_name": "en-es_ar", "data_files": "en-es_ar.tsv"}, {"config_name": "en-ms_MY", "data_files": "en-ms_MY.tsv"}, {"config_name": "en-el_GR", "data_files": "en-el_GR.tsv"}, {"config_name": "en_devel-ka", "data_files": "en_devel-ka.tsv"}, {"config_name": "en-fr-FR", "data_files": "en-fr-FR.tsv"}, {"config_name": "en_US-kk", "data_files": "en_US-kk.tsv"}, {"config_name": "es-ko", "data_files": "es-ko.tsv"}, {"config_name": "en-fr_AG", "data_files": "en-fr_AG.tsv"}, {"config_name": "en-zh-tw", "data_files": "en-zh-tw.tsv"}, {"config_name": "en-BrazilianPortuguese", "data_files": "en-BrazilianPortuguese.tsv"}, {"config_name": "en_GB-am", "data_files": "en_GB-am.tsv"}, {"config_name": "en-tam", "data_files": "en-tam.tsv"}, {"config_name": "en_US-af", "data_files": "en_US-af.tsv"}, {"config_name": "en_US-is", "data_files": "en_US-is.tsv"}, {"config_name": "en_GB-en_US", "data_files": "en_GB-en_US.tsv"}, {"config_name": "en-az", "data_files": "en-az.tsv"}, {"config_name": "en-en@pirate", "data_files": "[email protected]"}, {"config_name": "en_GB-fil", "data_files": "en_GB-fil.tsv"}, {"config_name": "en_US-pl_PL", "data_files": "en_US-pl_PL.tsv"}, {"config_name": "en_US-sl", "data_files": "en_US-sl.tsv"}, {"config_name": "en_US-nl", "data_files": "en_US-nl.tsv"}, {"config_name": "es-it", "data_files": "es-it.tsv"}, {"config_name": "en_GB-bar", "data_files": "en_GB-bar.tsv"}, {"config_name": "it-nb_NO", "data_files": "it-nb_NO.tsv"}, {"config_name": "eo-it", "data_files": "eo-it.tsv"}, {"config_name": "en_US-yue", "data_files": "en_US-yue.tsv"}, {"config_name": "en-glk", "data_files": "en-glk.tsv"}, {"config_name": "en-fi_FI", "data_files": "en-fi_FI.tsv"}, {"config_name": "es-cs", "data_files": "es-cs.tsv"}, {"config_name": "en_GB-pt_BR", "data_files": "en_GB-pt_BR.tsv"}, {"config_name": "en_GB-zgh", "data_files": "en_GB-zgh.tsv"}, {"config_name": "en_US-nl-BE", "data_files": "en_US-nl-BE.tsv"}, {"config_name": "en-ru-rCH", "data_files": 
"en-ru-rCH.tsv"}, {"config_name": "en-sr_CS", "data_files": "en-sr_CS.tsv"}, {"config_name": "en-ur", "data_files": "en-ur.tsv"}, {"config_name": "en_GB-th", "data_files": "en_GB-th.tsv"}, {"config_name": "en_US-id_ID", "data_files": "en_US-id_ID.tsv"}, {"config_name": "en_US-be_BY", "data_files": "en_US-be_BY.tsv"}, {"config_name": "en_devel-es_us", "data_files": "en_devel-es_us.tsv"}, {"config_name": "en-fr_CA", "data_files": "en-fr_CA.tsv"}, {"config_name": "en_GB-en", "data_files": "en_GB-en.tsv"}, {"config_name": "en_US-sk", "data_files": "en_US-sk.tsv"}, {"config_name": "en-uz-Latn", "data_files": "en-uz-Latn.tsv"}, {"config_name": "en_devel-eu", "data_files": "en_devel-eu.tsv"}, {"config_name": "en_GB-is_IS", "data_files": "en_GB-is_IS.tsv"}, {"config_name": "sl-en", "data_files": "sl-en.tsv"}, {"config_name": "en-ja_JA", "data_files": "en-ja_JA.tsv"}, {"config_name": "en-bn-BD", "data_files": "en-bn-BD.tsv"}, {"config_name": "fr-de", "data_files": "fr-de.tsv"}, {"config_name": "en-sr_SP", "data_files": "en-sr_SP.tsv"}, {"config_name": "en-nb-no", "data_files": "en-nb-no.tsv"}, {"config_name": "fr-nb_NO", "data_files": "fr-nb_NO.tsv"}, {"config_name": "en_US-lb", "data_files": "en_US-lb.tsv"}, {"config_name": "en-zh_hant", "data_files": "en-zh_hant.tsv"}, {"config_name": "en-be", "data_files": "en-be.tsv"}, {"config_name": "en_US-si", "data_files": "en_US-si.tsv"}, {"config_name": "en-ltg", "data_files": "en-ltg.tsv"}, {"config_name": "en-es_cl", "data_files": "en-es_cl.tsv"}, {"config_name": "en_US-gu", "data_files": "en_US-gu.tsv"}, {"config_name": "en-lb_LU", "data_files": "en-lb_LU.tsv"}, {"config_name": "en-ain", "data_files": "en-ain.tsv"}, {"config_name": "en-de", "data_files": "en-de.tsv"}, {"config_name": "en-es", "data_files": "en-es.tsv"}, {"config_name": "en-belarusian", "data_files": "en-belarusian.tsv"}, {"config_name": "en-kok", "data_files": "en-kok.tsv"}, {"config_name": "nl-fr", "data_files": "nl-fr.tsv"}, {"config_name": "en-ar_SA", "data_files": "en-ar_SA.tsv"}, {"config_name": "en-tk", "data_files": "en-tk.tsv"}, {"config_name": "en-kab", "data_files": "en-kab.tsv"}, {"config_name": "en-or-rIN", "data_files": "en-or-rIN.tsv"}, {"config_name": "en-ja-KS", "data_files": "en-ja-KS.tsv"}, {"config_name": "en-en-Shaw", "data_files": "en-en-Shaw.tsv"}, {"config_name": "en_GB-lo", "data_files": "en_GB-lo.tsv"}, {"config_name": "en_GB-gl_ES", "data_files": "en_GB-gl_ES.tsv"}, {"config_name": "en-sd", "data_files": "en-sd.tsv"}, {"config_name": "en_devel-es_ar", "data_files": "en_devel-es_ar.tsv"}, {"config_name": "en-he-il", "data_files": "en-he-il.tsv"}, {"config_name": "en_GB-zh_TW", "data_files": "en_GB-zh_TW.tsv"}, {"config_name": "en-cs_cz", "data_files": "en-cs_cz.tsv"}, {"config_name": "en_GB-mn", "data_files": "en_GB-mn.tsv"}, {"config_name": "en_US-jv", "data_files": "en_US-jv.tsv"}, {"config_name": "eo-nl", "data_files": "eo-nl.tsv"}, {"config_name": "en-zh_cn", "data_files": "en-zh_cn.tsv"}, {"config_name": "en-he_IL", "data_files": "en-he_IL.tsv"}, {"config_name": "en-IT", "data_files": "en-IT.tsv"}, {"config_name": "en-ja", "data_files": "en-ja.tsv"}, {"config_name": "en_US-fr-ca", "data_files": "en_US-fr-ca.tsv"}, {"config_name": "en-bqi", "data_files": "en-bqi.tsv"}, {"config_name": "en-ro-rRO", "data_files": "en-ro-rRO.tsv"}, {"config_name": "en-krl", "data_files": "en-krl.tsv"}, {"config_name": "en_US-tr_TR", "data_files": "en_US-tr_TR.tsv"}, {"config_name": "pl-lt", "data_files": "pl-lt.tsv"}, {"config_name": "en-zh_Hant_HK", "data_files": 
"en-zh_Hant_HK.tsv"}, {"config_name": "en_GB-sv_SE", "data_files": "en_GB-sv_SE.tsv"}, {"config_name": "en_US-pt-br", "data_files": "en_US-pt-br.tsv"}, {"config_name": "en-id-ID", "data_files": "en-id-ID.tsv"}, {"config_name": "en-fu", "data_files": "en-fu.tsv"}, {"config_name": "en-French", "data_files": "en-French.tsv"}, {"config_name": "eo-zh", "data_files": "eo-zh.tsv"}, {"config_name": "en-v20", "data_files": "en-v20.tsv"}, {"config_name": "en-iw-IL", "data_files": "en-iw-IL.tsv"}, {"config_name": "en_GB-af", "data_files": "en_GB-af.tsv"}, {"config_name": "en_GB-el", "data_files": "en_GB-el.tsv"}, {"config_name": "en-pa-IN", "data_files": "en-pa-IN.tsv"}, {"config_name": "en_devel-es_ve", "data_files": "en_devel-es_ve.tsv"}, {"config_name": "und-nb_NO", "data_files": "und-nb_NO.tsv"}, {"config_name": "en-lo", "data_files": "en-lo.tsv"}, {"config_name": "en-ar", "data_files": "en-ar.tsv"}, {"config_name": "en-b+zh+HANS+CN", "data_files": "en-b+zh+HANS+CN.tsv"}, {"config_name": "en_GB-byn", "data_files": "en_GB-byn.tsv"}, {"config_name": "en-en-rXC", "data_files": "en-en-rXC.tsv"}, {"config_name": "zh_Hant-nb_NO", "data_files": "zh_Hant-nb_NO.tsv"}, {"config_name": "en-fr", "data_files": "en-fr.tsv"}, {"config_name": "en-zh_HANT", "data_files": "en-zh_HANT.tsv"}, {"config_name": "en_US-fa-IR", "data_files": "en_US-fa-IR.tsv"}, {"config_name": "en_GB-vi", "data_files": "en_GB-vi.tsv"}, {"config_name": "en-Spanish", "data_files": "en-Spanish.tsv"}, {"config_name": "en-am_ET", "data_files": "en-am_ET.tsv"}, {"config_name": "en_devel-bn", "data_files": "en_devel-bn.tsv"}, {"config_name": "en-zh-cn", "data_files": "en-zh-cn.tsv"}, {"config_name": "en-tr-rTR", "data_files": "en-tr-rTR.tsv"}, {"config_name": "fr-cs", "data_files": "fr-cs.tsv"}, {"config_name": "en_US-nl-rBE", "data_files": "en_US-nl-rBE.tsv"}, {"config_name": "es-en", "data_files": "es-en.tsv"}, {"config_name": "en-sr@Cyrl", "data_files": "[email protected]"}, {"config_name": "fr-eu", "data_files": "fr-eu.tsv"}, {"config_name": "en_US-pl", "data_files": "en_US-pl.tsv"}, {"config_name": "en_US-nan", "data_files": "en_US-nan.tsv"}, {"config_name": "en_devel-pt-rBR", "data_files": "en_devel-pt-rBR.tsv"}, {"config_name": "en-sr_lat", "data_files": "en-sr_lat.tsv"}, {"config_name": "en_devel-no", "data_files": "en_devel-no.tsv"}, {"config_name": "pl-de", "data_files": "pl-de.tsv"}, {"config_name": "en-tlh", "data_files": "en-tlh.tsv"}, {"config_name": "en_US-cs_CZ", "data_files": "en_US-cs_CZ.tsv"}, {"config_name": "eo-pl", "data_files": "eo-pl.tsv"}, {"config_name": "en_devel-gl", "data_files": "en_devel-gl.tsv"}, {"config_name": "en-fi-FI", "data_files": "en-fi-FI.tsv"}, {"config_name": "en_US-ca_CA", "data_files": "en_US-ca_CA.tsv"}, {"config_name": "en_US-nb", "data_files": "en_US-nb.tsv"}, {"config_name": "en-is-IS", "data_files": "en-is-IS.tsv"}, {"config_name": "en_GB-io", "data_files": "en_GB-io.tsv"}, {"config_name": "en-UK", "data_files": "en-UK.tsv"}, {"config_name": "en-pt-pt", "data_files": "en-pt-pt.tsv"}, {"config_name": "en-fil", "data_files": "en-fil.tsv"}, {"config_name": "en-mi", "data_files": "en-mi.tsv"}, {"config_name": "en-sr-Cyrl", "data_files": "en-sr-Cyrl.tsv"}, {"config_name": "en_devel-hi", "data_files": "en_devel-hi.tsv"}, {"config_name": "en-nb-NB", "data_files": "en-nb-NB.tsv"}, {"config_name": "en-mnc", "data_files": "en-mnc.tsv"}, {"config_name": "en-mk", "data_files": "en-mk.tsv"}, {"config_name": "en-hrx", "data_files": "en-hrx.tsv"}, {"config_name": "en-ar_MA", "data_files": "en-ar_MA.tsv"}, 
{"config_name": "en_devel-es", "data_files": "en_devel-es.tsv"}, {"config_name": "en_GB-zh-rCN", "data_files": "en_GB-zh-rCN.tsv"}, {"config_name": "en-sa", "data_files": "en-sa.tsv"}, {"config_name": "en-bs", "data_files": "en-bs.tsv"}, {"config_name": "en_GB-tg", "data_files": "en_GB-tg.tsv"}, {"config_name": "en-si-LK", "data_files": "en-si-LK.tsv"}, {"config_name": "en-lt-LT", "data_files": "en-lt-LT.tsv"}, {"config_name": "en-hi", "data_files": "en-hi.tsv"}, {"config_name": "en-hu_hu", "data_files": "en-hu_hu.tsv"}, {"config_name": "en-mk_MK", "data_files": "en-mk_MK.tsv"}, {"config_name": "en_GB-de_DE", "data_files": "en_GB-de_DE.tsv"}, {"config_name": "messages-eo", "data_files": "messages-eo.tsv"}, {"config_name": "en-ku_IQ", "data_files": "en-ku_IQ.tsv"}, {"config_name": "en-rcf", "data_files": "en-rcf.tsv"}, {"config_name": "en-uz", "data_files": "en-uz.tsv"}, {"config_name": "en-by_lat", "data_files": "en-by_lat.tsv"}, {"config_name": "ia-nb_NO", "data_files": "ia-nb_NO.tsv"}, {"config_name": "messages-ko", "data_files": "messages-ko.tsv"}, {"config_name": "en_US-pt-rBR", "data_files": "en_US-pt-rBR.tsv"}, {"config_name": "en_GB-zu", "data_files": "en_GB-zu.tsv"}, {"config_name": "es-hr", "data_files": "es-hr.tsv"}, {"config_name": "en_devel-th", "data_files": "en_devel-th.tsv"}, {"config_name": "en-af", "data_files": "en-af.tsv"}, {"config_name": "en-ms-MY", "data_files": "en-ms-MY.tsv"}, {"config_name": "en-sr-Latn-RS", "data_files": "en-sr-Latn-RS.tsv"}, {"config_name": "en-de-ZH", "data_files": "en-de-ZH.tsv"}, {"config_name": "en-b+sr+Latn", "data_files": "en-b+sr+Latn.tsv"}, {"config_name": "en-cn", "data_files": "en-cn.tsv"}, {"config_name": "de-zh_Hans", "data_files": "de-zh_Hans.tsv"}, {"config_name": "en_devel-gu", "data_files": "en_devel-gu.tsv"}, {"config_name": "en_US-et_EE", "data_files": "en_US-et_EE.tsv"}, {"config_name": "en-und", "data_files": "en-und.tsv"}, {"config_name": "en_devel-es_ni", "data_files": "en_devel-es_ni.tsv"}, {"config_name": "en-en-rNZ", "data_files": "en-en-rNZ.tsv"}, {"config_name": "pl-fr", "data_files": "pl-fr.tsv"}, {"config_name": "de-es", "data_files": "de-es.tsv"}, {"config_name": "en-pt_br", "data_files": "en-pt_br.tsv"}, {"config_name": "en-gug", "data_files": "en-gug.tsv"}, {"config_name": "fr-fr", "data_files": "fr-fr.tsv"}, {"config_name": "en-fr-rFR", "data_files": "en-fr-rFR.tsv"}, {"config_name": "en-dsb", "data_files": "en-dsb.tsv"}, {"config_name": "en-tr-TR", "data_files": "en-tr-TR.tsv"}, {"config_name": "en-tw", "data_files": "en-tw.tsv"}, {"config_name": "en-bs_Latn", "data_files": "en-bs_Latn.tsv"}, {"config_name": "en_GB-hi", "data_files": "en_GB-hi.tsv"}, {"config_name": "en-norwegian", "data_files": "en-norwegian.tsv"}, {"config_name": "en-zh_Latn_pinyin", "data_files": "en-zh_Latn_pinyin.tsv"}, {"config_name": "en_US-es-mx", "data_files": "en_US-es-mx.tsv"}, {"config_name": "en_GB-nl_NL", "data_files": "en_GB-nl_NL.tsv"}, {"config_name": "es-bn", "data_files": "es-bn.tsv"}, {"config_name": "en-peo", "data_files": "en-peo.tsv"}, {"config_name": "en-de_LU", "data_files": "en-de_LU.tsv"}, {"config_name": "en-mni", "data_files": "en-mni.tsv"}, {"config_name": "en_GB-jam", "data_files": "en_GB-jam.tsv"}, {"config_name": "en-sr_cyr", "data_files": "en-sr_cyr.tsv"}, {"config_name": "en-ro-RO", "data_files": "en-ro-RO.tsv"}, {"config_name": "en-doi", "data_files": "en-doi.tsv"}, {"config_name": "en_GB-en-US", "data_files": "en_GB-en-US.tsv"}, {"config_name": "en-he", "data_files": "en-he.tsv"}, {"config_name": "en-et", 
"data_files": "en-et.tsv"}, {"config_name": "en-tl_PH", "data_files": "en-tl_PH.tsv"}, {"config_name": "en-sr-Cyrl-RS", "data_files": "en-sr-Cyrl-RS.tsv"}, {"config_name": "en-Dutch", "data_files": "en-Dutch.tsv"}, {"config_name": "en-uz_UZ", "data_files": "en-uz_UZ.tsv"}, {"config_name": "en-ur-rIN", "data_files": "en-ur-rIN.tsv"}, {"config_name": "en-kn", "data_files": "en-kn.tsv"}, {"config_name": "en-trv", "data_files": "en-trv.tsv"}, {"config_name": "en_US-ms_MY", "data_files": "en_US-ms_MY.tsv"}, {"config_name": "en-de-rFO", "data_files": "en-de-rFO.tsv"}, {"config_name": "en-zh-CN", "data_files": "en-zh-CN.tsv"}, {"config_name": "ru-de", "data_files": "ru-de.tsv"}, {"config_name": "en-pt_BR", "data_files": "en-pt_BR.tsv"}, {"config_name": "en_GB-ms_MY", "data_files": "en_GB-ms_MY.tsv"}, {"config_name": "en_GB-tr", "data_files": "en_GB-tr.tsv"}, {"config_name": "en-bn_IN", "data_files": "en-bn_IN.tsv"}, {"config_name": "en_GB-pt", "data_files": "en_GB-pt.tsv"}, {"config_name": "en_GB-wa", "data_files": "en_GB-wa.tsv"}, {"config_name": "en_US-te", "data_files": "en_US-te.tsv"}, {"config_name": "en-da-rDK", "data_files": "en-da-rDK.tsv"}, {"config_name": "en_US-zh_CN", "data_files": "en_US-zh_CN.tsv"}, {"config_name": "en_US-az", "data_files": "en_US-az.tsv"}, {"config_name": "en-sn", "data_files": "en-sn.tsv"}, {"config_name": "en_devel-zh_Hant", "data_files": "en_devel-zh_Hant.tsv"}, {"config_name": "en-sw", "data_files": "en-sw.tsv"}, {"config_name": "en-fr_fr", "data_files": "en-fr_fr.tsv"}, {"config_name": "en_GB-mhr", "data_files": "en_GB-mhr.tsv"}, {"config_name": "sv-se", "data_files": "sv-se.tsv"}, {"config_name": "en-mn", "data_files": "en-mn.tsv"}, {"config_name": "en-gl", "data_files": "en-gl.tsv"}, {"config_name": "en_GB-is", "data_files": "en_GB-is.tsv"}, {"config_name": "en-nl-NL", "data_files": "en-nl-NL.tsv"}, {"config_name": "dev-fa", "data_files": "dev-fa.tsv"}, {"config_name": "en-frp", "data_files": "en-frp.tsv"}, {"config_name": "en_GB-it", "data_files": "en_GB-it.tsv"}, {"config_name": "en_US-ja-JP", "data_files": "en_US-ja-JP.tsv"}, {"config_name": "en_US-vi_VN", "data_files": "en_US-vi_VN.tsv"}, {"config_name": "en-zu", "data_files": "en-zu.tsv"}, {"config_name": "en_US-zh_HK", "data_files": "en_US-zh_HK.tsv"}, {"config_name": "en_UK-nb_NO", "data_files": "en_UK-nb_NO.tsv"}, {"config_name": "en_GB-eo", "data_files": "en_GB-eo.tsv"}, {"config_name": "en-ar_YE", "data_files": "en-ar_YE.tsv"}, {"config_name": "messages-pt", "data_files": "messages-pt.tsv"}, {"config_name": "en_devel-hr", "data_files": "en_devel-hr.tsv"}, {"config_name": "ia-en", "data_files": "ia-en.tsv"}, {"config_name": "en-sr", "data_files": "en-sr.tsv"}, {"config_name": "en_US-el_GR", "data_files": "en_US-el_GR.tsv"}, {"config_name": "en_US-bg", "data_files": "en_US-bg.tsv"}, {"config_name": "en-be@latin", "data_files": "[email protected]"}, {"config_name": "en_US-zh_Hant", "data_files": "en_US-zh_Hant.tsv"}, {"config_name": "eo-fr", "data_files": "eo-fr.tsv"}, {"config_name": "en-uk_UA", "data_files": "en-uk_UA.tsv"}, {"config_name": "en_US-pt-BR", "data_files": "en_US-pt-BR.tsv"}, {"config_name": "nl-ko", "data_files": "nl-ko.tsv"}, {"config_name": "en-sl-SI", "data_files": "en-sl-SI.tsv"}, {"config_name": "en-to", "data_files": "en-to.tsv"}, {"config_name": "en_GB-ne", "data_files": "en_GB-ne.tsv"}, {"config_name": "en-la", "data_files": "en-la.tsv"}, {"config_name": "ru-ua", "data_files": "ru-ua.tsv"}, {"config_name": "en_GB-ia", "data_files": "en_GB-ia.tsv"}, {"config_name": 
"en_US-bn_BD", "data_files": "en_US-bn_BD.tsv"}, {"config_name": "en-zh_Hant", "data_files": "en-zh_Hant.tsv"}, {"config_name": "en_devel-nl_BE", "data_files": "en_devel-nl_BE.tsv"}, {"config_name": "en-id", "data_files": "en-id.tsv"}, {"config_name": "en_GB-pa", "data_files": "en_GB-pa.tsv"}, {"config_name": "en-gl_ES", "data_files": "en-gl_ES.tsv"}, {"config_name": "en-vi", "data_files": "en-vi.tsv"}, {"config_name": "fr-es", "data_files": "fr-es.tsv"}, {"config_name": "en-udm", "data_files": "en-udm.tsv"}, {"config_name": "en-es-rUS", "data_files": "en-es-rUS.tsv"}, {"config_name": "en-b+tok", "data_files": "en-b+tok.tsv"}, {"config_name": "it-fr_FR", "data_files": "it-fr_FR.tsv"}, {"config_name": "und-nl", "data_files": "und-nl.tsv"}, {"config_name": "en-pt_pt", "data_files": "en-pt_pt.tsv"}, {"config_name": "en-es_419", "data_files": "en-es_419.tsv"}, {"config_name": "en-jbo", "data_files": "en-jbo.tsv"}, {"config_name": "en_GB-nb-rNO", "data_files": "en_GB-nb-rNO.tsv"}, {"config_name": "en_GB-nl", "data_files": "en_GB-nl.tsv"}, {"config_name": "en-gl-ES", "data_files": "en-gl-ES.tsv"}, {"config_name": "en-de_AT", "data_files": "en-de_AT.tsv"}, {"config_name": "en-mk-MK", "data_files": "en-mk-MK.tsv"}, {"config_name": "en_GB-bg", "data_files": "en_GB-bg.tsv"}, {"config_name": "en_US-sc", "data_files": "en_US-sc.tsv"}, {"config_name": "en_US-kn", "data_files": "en_US-kn.tsv"}, {"config_name": "en-cy_GB", "data_files": "en-cy_GB.tsv"}, {"config_name": "en_US-mn", "data_files": "en_US-mn.tsv"}, {"config_name": "de-uk", "data_files": "de-uk.tsv"}, {"config_name": "en_GB-ko", "data_files": "en_GB-ko.tsv"}, {"config_name": "en-nl-rNL", "data_files": "en-nl-rNL.tsv"}, {"config_name": "en_devel-pt_PT", "data_files": "en_devel-pt_PT.tsv"}, {"config_name": "en_US-fi_FI", "data_files": "en_US-fi_FI.tsv"}, {"config_name": "en_devel-vi", "data_files": "en_devel-vi.tsv"}, {"config_name": "en_US-ru", "data_files": "en_US-ru.tsv"}, {"config_name": "en-hne", "data_files": "en-hne.tsv"}, {"config_name": "en-fi", "data_files": "en-fi.tsv"}, {"config_name": "en-ru_RU", "data_files": "en-ru_RU.tsv"}, {"config_name": "en_devel-es_cl", "data_files": "en_devel-es_cl.tsv"}, {"config_name": "de-el", "data_files": "de-el.tsv"}, {"config_name": "en_devel-ro", "data_files": "en_devel-ro.tsv"}, {"config_name": "en_GB-tt", "data_files": "en_GB-tt.tsv"}, {"config_name": "en-eng_GB", "data_files": "en-eng_GB.tsv"}, {"config_name": "en-lt-rLT", "data_files": "en-lt-rLT.tsv"}, {"config_name": "en-ota", "data_files": "en-ota.tsv"}, {"config_name": "en_devel-es_co", "data_files": "en_devel-es_co.tsv"}, {"config_name": "en-russian", "data_files": "en-russian.tsv"}, {"config_name": "en-ar-MA", "data_files": "en-ar-MA.tsv"}, {"config_name": "en-nn", "data_files": "en-nn.tsv"}, {"config_name": "eo-en", "data_files": "eo-en.tsv"}, {"config_name": "en_GB-cv", "data_files": "en_GB-cv.tsv"}, {"config_name": "en_devel-id_ID", "data_files": "en_devel-id_ID.tsv"}, {"config_name": "en_US-nb-NO", "data_files": "en_US-nb-NO.tsv"}, {"config_name": "en-it-rIT", "data_files": "en-it-rIT.tsv"}, {"config_name": "en_US-pl-PL", "data_files": "en_US-pl-PL.tsv"}, {"config_name": "en-ext", "data_files": "en-ext.tsv"}, {"config_name": "en-ko", "data_files": "en-ko.tsv"}, {"config_name": "en-tg", "data_files": "en-tg.tsv"}, {"config_name": "en-ga_IE", "data_files": "en-ga_IE.tsv"}, {"config_name": "en_devel-sr", "data_files": "en_devel-sr.tsv"}, {"config_name": "en-PT", "data_files": "en-PT.tsv"}, {"config_name": "en-sv", "data_files": 
"en-sv.tsv"}, {"config_name": "en_GB-son", "data_files": "en_GB-son.tsv"}, {"config_name": "en-et_ee", "data_files": "en-et_ee.tsv"}, {"config_name": "en_GB-el_GR", "data_files": "en_GB-el_GR.tsv"}, {"config_name": "en-jp", "data_files": "en-jp.tsv"}, {"config_name": "en-ga-rIE", "data_files": "en-ga-rIE.tsv"}, {"config_name": "sv-en", "data_files": "sv-en.tsv"}, {"config_name": "en_US-ua", "data_files": "en_US-ua.tsv"}, {"config_name": "en-sm", "data_files": "en-sm.tsv"}, {"config_name": "en-nap", "data_files": "en-nap.tsv"}, {"config_name": "en-portuguese", "data_files": "en-portuguese.tsv"}, {"config_name": "en_US-nl-NL", "data_files": "en_US-nl-NL.tsv"}, {"config_name": "en-es_ec", "data_files": "en-es_ec.tsv"}, {"config_name": "en_GB-crh", "data_files": "en_GB-crh.tsv"}, {"config_name": "en-tr_TR", "data_files": "en-tr_TR.tsv"}, {"config_name": "en-sr_RS@latin", "data_files": "[email protected]"}, {"config_name": "en-bg_BG", "data_files": "en-bg_BG.tsv"}, {"config_name": "en-hu", "data_files": "en-hu.tsv"}, {"config_name": "en-es_SV", "data_files": "en-es_SV.tsv"}, {"config_name": "en_GB-rw", "data_files": "en_GB-rw.tsv"}, {"config_name": "en-es_AR", "data_files": "en-es_AR.tsv"}, {"config_name": "en_devel-es_pe", "data_files": "en_devel-es_pe.tsv"}, {"config_name": "en-et-rEE", "data_files": "en-et-rEE.tsv"}, {"config_name": "en-ro-v26", "data_files": "en-ro-v26.tsv"}, {"config_name": "en-ne-NP", "data_files": "en-ne-NP.tsv"}, {"config_name": "en-es-ar", "data_files": "en-es-ar.tsv"}, {"config_name": "en-en_ZA", "data_files": "en-en_ZA.tsv"}, {"config_name": "en_devel-lt", "data_files": "en_devel-lt.tsv"}, {"config_name": "en-eg", "data_files": "en-eg.tsv"}, {"config_name": "zh_Latn-zh_Hans", "data_files": "zh_Latn-zh_Hans.tsv"}, {"config_name": "en_GB-so", "data_files": "en_GB-so.tsv"}, {"config_name": "en-hr-rHR", "data_files": "en-hr-rHR.tsv"}, {"config_name": "en-lt_LT", "data_files": "en-lt_LT.tsv"}, {"config_name": "en-io", "data_files": "en-io.tsv"}, {"config_name": "en-sh-rHR", "data_files": "en-sh-rHR.tsv"}, {"config_name": "en-uk", "data_files": "en-uk.tsv"}, {"config_name": "en_GB-cs-CZ", "data_files": "en_GB-cs-CZ.tsv"}, {"config_name": "en-de-rCH", "data_files": "en-de-rCH.tsv"}, {"config_name": "en-nah", "data_files": "en-nah.tsv"}, {"config_name": "en_devel-tr", "data_files": "en_devel-tr.tsv"}, {"config_name": "en-de-rAT", "data_files": "en-de-rAT.tsv"}, {"config_name": "eo-sv", "data_files": "eo-sv.tsv"}, {"config_name": "en-nb", "data_files": "en-nb.tsv"}, {"config_name": "en_GB-ab", "data_files": "en_GB-ab.tsv"}, {"config_name": "en_US-de-DE", "data_files": "en_US-de-DE.tsv"}, {"config_name": "en-de_alm_x", "data_files": "en-de_alm_x.tsv"}, {"config_name": "en_GB-it-IT", "data_files": "en_GB-it-IT.tsv"}, {"config_name": "en-aa", "data_files": "en-aa.tsv"}, {"config_name": "en_devel-sq", "data_files": "en_devel-sq.tsv"}, {"config_name": "en_devel-en_au", "data_files": "en_devel-en_au.tsv"}, {"config_name": "en-sl", "data_files": "en-sl.tsv"}, {"config_name": "en-sr-rSP", "data_files": "en-sr-rSP.tsv"}, {"config_name": "en-ckb", "data_files": "en-ckb.tsv"}, {"config_name": "en_devel-pt_pt", "data_files": "en_devel-pt_pt.tsv"}, {"config_name": "en_devel-ar", "data_files": "en_devel-ar.tsv"}, {"config_name": "en-nn-NO", "data_files": "en-nn-NO.tsv"}, {"config_name": "es-fr", "data_files": "es-fr.tsv"}, {"config_name": "en-mk-rMK", "data_files": "en-mk-rMK.tsv"}, {"config_name": "en-spanish", "data_files": "en-spanish.tsv"}, {"config_name": "en_GB-ve", "data_files": 
"en_GB-ve.tsv"}, {"config_name": "en_GB-zh_HK", "data_files": "en_GB-zh_HK.tsv"}, {"config_name": "en_GB-kmr", "data_files": "en_GB-kmr.tsv"}, {"config_name": "en-no_nb", "data_files": "en-no_nb.tsv"}, {"config_name": "en_GB-sq", "data_files": "en_GB-sq.tsv"}, {"config_name": "en_US-ro-RO", "data_files": "en_US-ro-RO.tsv"}, {"config_name": "en-zh-rHK", "data_files": "en-zh-rHK.tsv"}, {"config_name": "en-Russian", "data_files": "en-Russian.tsv"}, {"config_name": "en_GB-ht", "data_files": "en_GB-ht.tsv"}, {"config_name": "en_GB-ug", "data_files": "en_GB-ug.tsv"}, {"config_name": "en-na", "data_files": "en-na.tsv"}, {"config_name": "en_devel-es_gt", "data_files": "en_devel-es_gt.tsv"}, {"config_name": "en-ka-rGE", "data_files": "en-ka-rGE.tsv"}, {"config_name": "en_US-bn-rBD", "data_files": "en_US-bn-rBD.tsv"}, {"config_name": "eo-ro", "data_files": "eo-ro.tsv"}, {"config_name": "en_GB-ko_KR", "data_files": "en_GB-ko_KR.tsv"}, {"config_name": "en-sr@Latn", "data_files": "[email protected]"}, {"config_name": "en-french", "data_files": "en-french.tsv"}, {"config_name": "es-nl", "data_files": "es-nl.tsv"}, {"config_name": "en-georgian", "data_files": "en-georgian.tsv"}, {"config_name": "en_devel-sl", "data_files": "en_devel-sl.tsv"}, {"config_name": "en-jv", "data_files": "en-jv.tsv"}, {"config_name": "en-ur-UR", "data_files": "en-ur-UR.tsv"}, {"config_name": "en-dv", "data_files": "en-dv.tsv"}, {"config_name": "en_US-pt-PT", "data_files": "en_US-pt-PT.tsv"}, {"config_name": "en-ar_LY", "data_files": "en-ar_LY.tsv"}, {"config_name": "en-sv-SE", "data_files": "en-sv-SE.tsv"}, {"config_name": "en-ca_ES@valencia", "data_files": "[email protected]"}, {"config_name": "en_devel-oc", "data_files": "en_devel-oc.tsv"}, {"config_name": "en-th_TH", "data_files": "en-th_TH.tsv"}, {"config_name": "en-de_CH", "data_files": "en-de_CH.tsv"}, {"config_name": "en-ca-valencia", "data_files": "en-ca-valencia.tsv"}, {"config_name": "en-crh", "data_files": "en-crh.tsv"}, {"config_name": "en_US-en@pirate", "data_files": "[email protected]"}, {"config_name": "en-haw", "data_files": "en-haw.tsv"}, {"config_name": "en-sk-rSK", "data_files": "en-sk-rSK.tsv"}, {"config_name": "en-sr@latin", "data_files": "[email protected]"}, {"config_name": "en-jam", "data_files": "en-jam.tsv"}, {"config_name": "en_devel-ko", "data_files": "en_devel-ko.tsv"}, {"config_name": "en_devel-de", "data_files": "en_devel-de.tsv"}, {"config_name": "messages-nb_NO", "data_files": "messages-nb_NO.tsv"}, {"config_name": "en_GB-no", "data_files": "en_GB-no.tsv"}, {"config_name": "en_US-tok", "data_files": "en_US-tok.tsv"}, {"config_name": "en_US-zh_Hans", "data_files": "en_US-zh_Hans.tsv"}, {"config_name": "en-hsb", "data_files": "en-hsb.tsv"}, {"config_name": "en-eo", "data_files": "en-eo.tsv"}, {"config_name": "en-eu_ES", "data_files": "en-eu_ES.tsv"}, {"config_name": "en-ayc", "data_files": "en-ayc.tsv"}, {"config_name": "en-ca", "data_files": "en-ca.tsv"}, {"config_name": "en-fr_LU", "data_files": "en-fr_LU.tsv"}, {"config_name": "en-vi-rVN", "data_files": "en-vi-rVN.tsv"}, {"config_name": "en-pr", "data_files": "en-pr.tsv"}, {"config_name": "en-vls", "data_files": "en-vls.tsv"}, {"config_name": "es-gl", "data_files": "es-gl.tsv"}, {"config_name": "en_GB-nb-NO", "data_files": "en_GB-nb-NO.tsv"}, {"config_name": "en_GB-haw", "data_files": "en_GB-haw.tsv"}, {"config_name": "pt_BR-es", "data_files": "pt_BR-es.tsv"}, {"config_name": "en-nn-rNO", "data_files": "en-nn-rNO.tsv"}, {"config_name": "en_US-zh-tw", "data_files": "en_US-zh-tw.tsv"}, 
{"config_name": "en-ar-AA", "data_files": "en-ar-AA.tsv"}, {"config_name": "en_GB-fr_FR", "data_files": "en_GB-fr_FR.tsv"}, {"config_name": "en_GB-gez", "data_files": "en_GB-gez.tsv"}, {"config_name": "en-ID", "data_files": "en-ID.tsv"}, {"config_name": "en_GB-oc", "data_files": "en_GB-oc.tsv"}, {"config_name": "es-ia", "data_files": "es-ia.tsv"}, {"config_name": "en_GB-kv", "data_files": "en_GB-kv.tsv"}, {"config_name": "en-es-419", "data_files": "en-es-419.tsv"}, {"config_name": "eo-pt", "data_files": "eo-pt.tsv"}, {"config_name": "it-en_EN", "data_files": "it-en_EN.tsv"}, {"config_name": "en-czech", "data_files": "en-czech.tsv"}, {"config_name": "eo-cs", "data_files": "eo-cs.tsv"}, {"config_name": "en_devel-es_sv", "data_files": "en_devel-es_sv.tsv"}, {"config_name": "en-es_CL", "data_files": "en-es_CL.tsv"}, {"config_name": "en-si", "data_files": "en-si.tsv"}, {"config_name": "en-cs", "data_files": "en-cs.tsv"}, {"config_name": "en-sv_SE", "data_files": "en-sv_SE.tsv"}, {"config_name": "en_US-ne_NP", "data_files": "en_US-ne_NP.tsv"}, {"config_name": "en_GB-fy", "data_files": "en_GB-fy.tsv"}, {"config_name": "en_devel-en-rGB", "data_files": "en_devel-en-rGB.tsv"}, {"config_name": "en_GB-sr", "data_files": "en_GB-sr.tsv"}, {"config_name": "en-es-rPE", "data_files": "en-es-rPE.tsv"}, {"config_name": "en_US-en", "data_files": "en_US-en.tsv"}, {"config_name": "en_GB-eu", "data_files": "en_GB-eu.tsv"}, {"config_name": "en_GB-nb_NO", "data_files": "en_GB-nb_NO.tsv"}, {"config_name": "en-uz-UZ", "data_files": "en-uz-UZ.tsv"}, {"config_name": "eo-ko", "data_files": "eo-ko.tsv"}, {"config_name": "en-lb", "data_files": "en-lb.tsv"}, {"config_name": "en-lg", "data_files": "en-lg.tsv"}, {"config_name": "en-Esperanto", "data_files": "en-Esperanto.tsv"}, {"config_name": "en-ar-SA", "data_files": "en-ar-SA.tsv"}, {"config_name": "en_GB-ro_RO", "data_files": "en_GB-ro_RO.tsv"}, {"config_name": "en-cmn", "data_files": "en-cmn.tsv"}, {"config_name": "en-mni@bengali", "data_files": "[email protected]"}, {"config_name": "en-ks", "data_files": "en-ks.tsv"}, {"config_name": "en_US-pt_BR", "data_files": "en_US-pt_BR.tsv"}, {"config_name": "ru-nb_NO", "data_files": "ru-nb_NO.tsv"}, {"config_name": "en-fr-rCA", "data_files": "en-fr-rCA.tsv"}, {"config_name": "en-kn-rIN", "data_files": "en-kn-rIN.tsv"}, {"config_name": "en_devel-sq_al", "data_files": "en_devel-sq_al.tsv"}, {"config_name": "en_US-nb_NO", "data_files": "en_US-nb_NO.tsv"}, {"config_name": "en-ce", "data_files": "en-ce.tsv"}, {"config_name": "en_US-ga", "data_files": "en_US-ga.tsv"}, {"config_name": "en-en-rZA", "data_files": "en-en-rZA.tsv"}, {"config_name": "en-rue", "data_files": "en-rue.tsv"}, {"config_name": "en-es_CO", "data_files": "en-es_CO.tsv"}, {"config_name": "en-es-es", "data_files": "en-es-es.tsv"}, {"config_name": "en-fa", "data_files": "en-fa.tsv"}, {"config_name": "en-de_DE", "data_files": "en-de_DE.tsv"}, {"config_name": "en-kg", "data_files": "en-kg.tsv"}, {"config_name": "en_US-es_ES", "data_files": "en_US-es_ES.tsv"}, {"config_name": "en-bg-rBG", "data_files": "en-bg-rBG.tsv"}, {"config_name": "fr-nl", "data_files": "fr-nl.tsv"}, {"config_name": "en_GB-as", "data_files": "en_GB-as.tsv"}, {"config_name": "en-nl", "data_files": "en-nl.tsv"}, {"config_name": "en-ka-GE", "data_files": "en-ka-GE.tsv"}, {"config_name": "en-sah", "data_files": "en-sah.tsv"}, {"config_name": "en_US-ur", "data_files": "en_US-ur.tsv"}, {"config_name": "und-si", "data_files": "und-si.tsv"}, {"config_name": "en_devel-en_ca", "data_files": 
"en_devel-en_ca.tsv"}, {"config_name": "en-cs-CZ", "data_files": "en-cs-CZ.tsv"}, {"config_name": "en-de_DIVEO", "data_files": "en-de_DIVEO.tsv"}, {"config_name": "en-es-PE", "data_files": "en-es-PE.tsv"}, {"config_name": "en-nb-rNO", "data_files": "en-nb-rNO.tsv"}, {"config_name": "en_GB-in", "data_files": "en_GB-in.tsv"}, {"config_name": "en_US-grc", "data_files": "en_US-grc.tsv"}, {"config_name": "en_GB-ast_ES", "data_files": "en_GB-ast_ES.tsv"}, {"config_name": "nb_NO-en", "data_files": "nb_NO-en.tsv"}, {"config_name": "en_devel-zh-cn", "data_files": "en_devel-zh-cn.tsv"}, {"config_name": "en_US-th", "data_files": "en_US-th.tsv"}, {"config_name": "en_devel-fa", "data_files": "en_devel-fa.tsv"}, {"config_name": "en_devel-es_py", "data_files": "en_devel-es_py.tsv"}, {"config_name": "en-prg", "data_files": "en-prg.tsv"}, {"config_name": "en_GB-uk_UA", "data_files": "en_GB-uk_UA.tsv"}, {"config_name": "en-gn", "data_files": "en-gn.tsv"}, {"config_name": "en-sat", "data_files": "en-sat.tsv"}, {"config_name": "en-jpn_JP", "data_files": "en-jpn_JP.tsv"}, {"config_name": "en-ko-rKR", "data_files": "en-ko-rKR.tsv"}, {"config_name": "en-anp", "data_files": "en-anp.tsv"}, {"config_name": "en-si_LK", "data_files": "en-si_LK.tsv"}, {"config_name": "en_GB-gn", "data_files": "en_GB-gn.tsv"}, {"config_name": "en-kn_IN", "data_files": "en-kn_IN.tsv"}, {"config_name": "en-b+jbo", "data_files": "en-b+jbo.tsv"}, {"config_name": "en-me", "data_files": "en-me.tsv"}, {"config_name": "en-lfn", "data_files": "en-lfn.tsv"}, {"config_name": "en-cz", "data_files": "en-cz.tsv"}, {"config_name": "en_GB-iu", "data_files": "en_GB-iu.tsv"}, {"config_name": "en-uz@cyrillic", "data_files": "[email protected]"}, {"config_name": "en_US-es-419", "data_files": "en_US-es-419.tsv"}, {"config_name": "en_US-ug", "data_files": "en_US-ug.tsv"}, {"config_name": "es-ext", "data_files": "es-ext.tsv"}, {"config_name": "en_GB-pa_PK", "data_files": "en_GB-pa_PK.tsv"}, {"config_name": "en-ast", "data_files": "en-ast.tsv"}, {"config_name": "en_US-no", "data_files": "en_US-no.tsv"}, {"config_name": "en-afh", "data_files": "en-afh.tsv"}, {"config_name": "en-fi-rFI", "data_files": "en-fi-rFI.tsv"}, {"config_name": "en-ar-rLY", "data_files": "en-ar-rLY.tsv"}, {"config_name": "en_devel-pt_br", "data_files": "en_devel-pt_br.tsv"}, {"config_name": "en-ca_ES", "data_files": "en-ca_ES.tsv"}, {"config_name": "fr-ru", "data_files": "fr-ru.tsv"}, {"config_name": "en-eo_XX", "data_files": "en-eo_XX.tsv"}, {"config_name": "en_US-tl", "data_files": "en_US-tl.tsv"}, {"config_name": "en_GB-gl", "data_files": "en_GB-gl.tsv"}, {"config_name": "en_UK-es_ES", "data_files": "en_UK-es_ES.tsv"}, {"config_name": "en-be-rBY", "data_files": "en-be-rBY.tsv"}, {"config_name": "en-b+hsb", "data_files": "en-b+hsb.tsv"}, {"config_name": "en_GB-ps", "data_files": "en_GB-ps.tsv"}, {"config_name": "en-hi-IN", "data_files": "en-hi-IN.tsv"}, {"config_name": "en-PL", "data_files": "en-PL.tsv"}, {"config_name": "en_GB-dv", "data_files": "en_GB-dv.tsv"}, {"config_name": "en_US-sv", "data_files": "en_US-sv.tsv"}, {"config_name": "en_US-en_AU", "data_files": "en_US-en_AU.tsv"}, {"config_name": "en_GB-frp", "data_files": "en_GB-frp.tsv"}, {"config_name": "en_GB-sv-SE", "data_files": "en_GB-sv-SE.tsv"}, {"config_name": "en-ZH-rCN", "data_files": "en-ZH-rCN.tsv"}, {"config_name": "en-sq", "data_files": "en-sq.tsv"}, {"config_name": "en-README_FA", "data_files": "en-README_FA.tsv"}, {"config_name": "en_devel-ca", "data_files": "en_devel-ca.tsv"}, {"config_name": "en_UK-fr_FR", 
"data_files": "en_UK-fr_FR.tsv"}, {"config_name": "en-zh_Hans", "data_files": "en-zh_Hans.tsv"}, {"config_name": "en-ar_DZ", "data_files": "en-ar_DZ.tsv"}, {"config_name": "en-ml", "data_files": "en-ml.tsv"}, {"config_name": "en-zh-rTW", "data_files": "en-zh-rTW.tsv"}, {"config_name": "en-uz-Cyrl", "data_files": "en-uz-Cyrl.tsv"}, {"config_name": "messages-it", "data_files": "messages-it.tsv"}, {"config_name": "en_devel-ru", "data_files": "en_devel-ru.tsv"}, {"config_name": "en-es-MX", "data_files": "en-es-MX.tsv"}, {"config_name": "en_US-zh-Hant-HK", "data_files": "en_US-zh-Hant-HK.tsv"}, {"config_name": "en-de@formal", "data_files": "[email protected]"}, {"config_name": "en_US-ar-AA", "data_files": "en_US-ar-AA.tsv"}, {"config_name": "en-en_IE", "data_files": "en-en_IE.tsv"}, {"config_name": "en_US-de", "data_files": "en_US-de.tsv"}, {"config_name": "en-eu", "data_files": "en-eu.tsv"}, {"config_name": "en-tl", "data_files": "en-tl.tsv"}, {"config_name": "ia-ru", "data_files": "ia-ru.tsv"}, {"config_name": "en_GB-my", "data_files": "en_GB-my.tsv"}, {"config_name": "en-Polish", "data_files": "en-Polish.tsv"}, {"config_name": "en_GB-si", "data_files": "en_GB-si.tsv"}, {"config_name": "eo-nb_NO", "data_files": "eo-nb_NO.tsv"}, {"config_name": "en_devel-iw", "data_files": "en_devel-iw.tsv"}, {"config_name": "en_GB-pt_PT", "data_files": "en_GB-pt_PT.tsv"}, {"config_name": "en_GB-tt@iqtelif", "data_files": "[email protected]"}, {"config_name": "en-sk", "data_files": "en-sk.tsv"}, {"config_name": "es-de", "data_files": "es-de.tsv"}, {"config_name": "en-enm", "data_files": "en-enm.tsv"}, {"config_name": "en_US-sk-SK", "data_files": "en_US-sk-SK.tsv"}, {"config_name": "en_GB-be", "data_files": "en_GB-be.tsv"}, {"config_name": "nl-en", "data_files": "nl-en.tsv"}, {"config_name": "en_US-sr_RS", "data_files": "en_US-sr_RS.tsv"}, {"config_name": "en_GB-cy", "data_files": "en_GB-cy.tsv"}, {"config_name": "en_devel-es_uy", "data_files": "en_devel-es_uy.tsv"}, {"config_name": "en-fa-AF", "data_files": "en-fa-AF.tsv"}]} | 2024-01-19T21:29:44+00:00 | [] | [
"aa",
"ab",
"ace",
"ach",
"af",
"afh",
"aii",
"ain",
"ajp",
"ak",
"am",
"an",
"ang",
"anp",
"apc",
"ar",
"arn",
"ars",
"as",
"ast",
"ay",
"ayc",
"az",
"azb",
"ba",
"bar",
"bd",
"be",
"bem",
"ber",
"bg",
"bho",
"bm",
"bn",
"bo",
"bp",
"bqi",
"br",
"brx",
"bs",
"bul",
"by",
"ca",
"ce",
"ceb",
"ckb",
"cmn",
"cn",
"cnr",
"co",
"cr",
"crh",
"cs",
"csb",
"cv",
"cy",
"cz",
"da",
"de",
"dev",
"doi",
"dsb",
"dua",
"dum",
"dv",
"dz",
"eg",
"el",
"en",
"eng",
"enm",
"eo",
"es",
"et",
"eu",
"ext",
"fa",
"fi",
"fil",
"fo",
"fr",
"fra",
"frm",
"frp",
"frs",
"fu",
"fur",
"fy",
"ga",
"gb",
"gd",
"gl",
"glk",
"gmh",
"gn",
"gr",
"gsw",
"gu",
"guc",
"gug",
"gum",
"guw",
"gv",
"ha",
"haw",
"he",
"hi",
"hne",
"hr",
"hrx",
"hsb",
"ht",
"hu",
"hy",
"hz",
"ia",
"id",
"ie",
"ig",
"in",
"io",
"is",
"it",
"iw",
"ja",
"jam",
"jbo",
"ji",
"jp",
"jpn",
"jv",
"ka",
"kab",
"kg",
"kk",
"kl",
"km",
"kmr",
"kn",
"ko",
"kok",
"kr",
"krl",
"ks",
"ksh",
"ku",
"kw",
"ky",
"la",
"lb",
"lfn",
"lg",
"li",
"lk",
"ln",
"lo",
"lt",
"ltg",
"lv",
"lzh",
"mai",
"me",
"mg",
"mhr",
"mi",
"mjw",
"mk",
"ml",
"mn",
"mnc",
"mni",
"mnw",
"mo",
"mr",
"ms",
"mt",
"my",
"na",
"nah",
"nan",
"nap",
"nb",
"nds",
"ne",
"nl",
"nn",
"no",
"np",
"nqo",
"ny",
"oc",
"oj",
"om",
"or",
"os",
"ota",
"pa",
"pam",
"pap",
"pbb",
"peo",
"pk",
"pl",
"pms",
"pr",
"prg",
"ps",
"pt",
"pu",
"qt",
"rcf",
"rm",
"ro",
"rom",
"ru",
"rue",
"rw",
"ryu",
"sa",
"sah",
"sai",
"sat",
"sc",
"sco",
"sd",
"sdh",
"se",
"sh",
"shn",
"si",
"sk",
"skr",
"sl",
"sm",
"sma",
"sn",
"so",
"sq",
"sr",
"st",
"su",
"sv",
"sw",
"szl",
"ta",
"tam",
"te",
"tet",
"tg",
"th",
"ti",
"tk",
"tl",
"tlh",
"tn",
"to",
"tok",
"tr",
"trv",
"tt",
"tum",
"tw",
"ty",
"tzm",
"ua",
"udm",
"ug",
"uk",
"und",
"ur",
"us",
"uz",
"vec",
"vi",
"vls",
"wa",
"wae",
"wo",
"xh",
"yi",
"yo",
"yue",
"zgh",
"zh",
"zu"
] | TAGS
#task_categories-translation #task_categories-text2text-generation #annotations_creators-crowdsourced #size_categories-1M<n<10M #language-Afar #language-Abkhazian #language-Achinese #language-Acoli #language-Afrikaans #language-Afrihili #language-Assyrian Neo-Aramaic #language-Ainu (Japan) #language-South Levantine Arabic #language-Akan #language-Amharic #language-Aragonese #language-Old English (ca. 450-1100) #language-Angika #language-Levantine Arabic #language-Arabic #language-Mapudungun #language-Najdi Arabic #language-Assamese #language-Asturian #language-Aymara #language-Southern Aymara #language-Azerbaijani #language-South Azerbaijani #language-Bashkir #language-Bavarian #language-bd #language-Belarusian #language-Bemba (Zambia) #language-ber #language-Bulgarian #language-Bhojpuri #language-Bambara #language-Bengali #language-Tibetan #language-bp #language-Bakhtiari #language-Breton #language-Bodo (India) #language-Bosnian #language-Bulgarian #language-by #language-Catalan #language-Chechen #language-Cebuano #language-Central Kurdish #language-Mandarin Chinese #language-cn #language-Montenegrin #language-Corsican #language-Cree #language-Crimean Tatar #language-Czech #language-Kashubian #language-Chuvash #language-Welsh #language-cz #language-Danish #language-German #language-Domung #language-Dogri (macrolanguage) #language-Lower Sorbian #language-Duala #language-Middle Dutch (ca. 1050-1350) #language-Dhivehi #language-Dzongkha #language-eg #language-Modern Greek (1453-) #language-English #language-English #language-Middle English (1100-1500) #language-Esperanto #language-Spanish #language-Estonian #language-Basque #language-Extremaduran #language-Persian #language-Finnish #language-Filipino #language-Faroese #language-French #language-French #language-Middle French (ca. 1400-1600) #language-Arpitan #language-Eastern Frisian #language-fu #language-Friulian #language-Western Frisian #language-Irish #language-gb #language-Scottish Gaelic #language-Galician #language-Gilaki #language-Middle High German (ca. 
1050-1500) #language-Guarani #language-gr #language-Swiss German #language-Gujarati #language-Wayuu #language-Paraguayan Guaraní #language-Guambiano #language-Gun #language-Manx #language-Hausa #language-Hawaiian #language-Hebrew #language-Hindi #language-Chhattisgarhi #language-Croatian #language-Hunsrik #language-Upper Sorbian #language-Haitian #language-Hungarian #language-Armenian #language-Herero #language-Interlingua (International Auxiliary Language Association) #language-Indonesian #language-Interlingue #language-Igbo #language-in #language-Ido #language-Icelandic #language-Italian #language-iw #language-Japanese #language-Jamaican Creole English #language-Lojban #language-ji #language-jp #language-Japanese #language-Javanese #language-Georgian #language-Kabyle #language-Kongo #language-Kazakh #language-Kalaallisut #language-Khmer #language-Northern Kurdish #language-Kannada #language-Korean #language-Konkani (macrolanguage) #language-Kanuri #language-Karelian #language-Kashmiri #language-Kölsch #language-Kurdish #language-Cornish #language-Kirghiz #language-Latin #language-Luxembourgish #language-Lingua Franca Nova #language-Ganda #language-Limburgan #language-lk #language-Lingala #language-Lao #language-Lithuanian #language-Latgalian #language-Latvian #language-Literary Chinese #language-Maithili #language-me #language-Malagasy #language-Eastern Mari #language-Maori #language-Karbi #language-Macedonian #language-Malayalam #language-Mongolian #language-Manchu #language-Manipuri #language-Mon #language-mo #language-Marathi #language-Malay (macrolanguage) #language-Maltese #language-Burmese #language-Nauru #language-nah #language-Min Nan Chinese #language-Neapolitan #language-Norwegian Bokmål #language-Low German #language-Nepali (macrolanguage) #language-Dutch #language-Norwegian Nynorsk #language-Norwegian #language-np #language-N'Ko #language-Nyanja #language-Occitan (post 1500) #language-Ojibwa #language-Oromo #language-Oriya (macrolanguage) #language-Ossetian #language-Ottoman Turkish (1500-1928) #language-Panjabi #language-Pampanga #language-Papiamento #language-Páez #language-Old Persian (ca. 600-400 B.C.) 
#language-pk #language-Polish #language-Piemontese #language-pr #language-Prussian #language-Pushto #language-Portuguese #language-pu #language-qt #language-Réunion Creole French #language-Romansh #language-Romanian #language-Romany #language-Russian #language-Rusyn #language-Kinyarwanda #language-Central Okinawan #language-Sanskrit #language-Yakut #language-sai #language-Santali #language-Sardinian #language-Scots #language-Sindhi #language-Southern Kurdish #language-Northern Sami #language-Serbo-Croatian #language-Shan #language-Sinhala #language-Slovak #language-Saraiki #language-Slovenian #language-Samoan #language-Southern Sami #language-Shona #language-Somali #language-Albanian #language-Serbian #language-Southern Sotho #language-Sundanese #language-Swedish #language-Swahili (macrolanguage) #language-Silesian #language-Tamil #language-Tamil #language-Telugu #language-Tetum #language-Tajik #language-Thai #language-Tigrinya #language-Turkmen #language-Tagalog #language-Klingon #language-Tswana #language-Tonga (Tonga Islands) #language-Toki Pona #language-Turkish #language-Sediq #language-Tatar #language-Tumbuka #language-Twi #language-Tahitian #language-Central Atlas Tamazight #language-ua #language-Udmurt #language-Uighur #language-Ukrainian #language-Undetermined #language-Urdu #language-us #language-Uzbek #language-Venetian #language-Vietnamese #language-Vlaams #language-Walloon #language-Walser #language-Wolof #language-Xhosa #language-Yiddish #language-Yoruba #language-Yue Chinese #language-Standard Moroccan Tamazight #language-Chinese #language-Zulu #region-us
| # Dataset Card for Weblate Translations
A dataset containing strings from projects hosted on Weblate and their translations into other languages.
Please consider donating or contributing to Weblate if you find this dataset useful.
To avoid rows with values like "None" and "N/A" being interpreted as missing values, pass the keep_default_na parameter like this:
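A minimal sketch — the repository id below is a placeholder, and "en-fr" stands in for any of the "<source>-<target>" config names listed in the metadata above:

```python
from datasets import load_dataset

# Placeholder repo id; substitute the actual Hub path of this dataset.
# keep_default_na=False is forwarded to pandas, so strings such as
# "None" and "N/A" survive as literal text instead of becoming NaN.
ds = load_dataset("<user>/weblate-translations", "en-fr", keep_default_na=False)
```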
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License: Each sentence pair in the dataset has a corresponding license in the "license" column. This license is the one specified in the component or project containing the sentence.
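Because the license travels with each row, a config can be narrowed to, for example, permissively licensed pairs only. The sketch below reuses the `ds` object from the loading example above; the license identifiers and the default "train" split are assumptions, not guarantees about this dataset:

```python
permissive = {"MIT", "Apache-2.0", "BSD-3-Clause"}

# Keep only sentence pairs whose component/project license is in the set.
filtered = ds["train"].filter(lambda row: row["license"] in permissive)
print(filtered.num_rows)
```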
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
- Machine Translation
- Language Identification
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
- Sentence pairs with empty/missing elements were dropped.
- Identical pairs were dropped.
- Trailing whitespace was stripped.
- Rows were deduplicated based on all 3 columns including "license", on a per-config/subset/tsv-file basis. This means a single config might still contain two identical sentence pairs with different licenses, and a different config/subset might contain the exact same row (most likely a different variant/dialect of the same language(s)). A rough reconstruction of these steps is sketched below.
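The following sketch mirrors the cleaning steps described above; the file name and column names are illustrative assumptions, not the dataset's actual schema:

```python
import pandas as pd

# Illustrative reconstruction of the cleaning pipeline described above.
df = pd.read_csv(
    "en-fr.tsv",
    sep="\t",
    names=["source", "target", "license"],  # assumed column order
    keep_default_na=False,
)

df = df[(df["source"] != "") & (df["target"] != "")]  # drop empty/missing elements
df = df[df["source"] != df["target"]]                 # drop identical pairs
df["source"] = df["source"].str.rstrip()              # strip trailing whitespace
df["target"] = df["target"].str.rstrip()
df = df.drop_duplicates()                             # dedupe on all 3 columns
```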
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
Weblate users.
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Weblate Translations\n\n\n\nA dataset containing strings from projects hosted on Weblate and their translations into other languages.\nPlease consider donating or contributing to Weblate if you find this dataset useful.\n\nTo avoid rows with values like \"None\" and \"N/A\" being interpreted as missing values, pass the keep_default_na parameter like this:",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License: Each sentence pair in the dataset has a corresponding license in the \"license\" column. This license is the one specified in the component or project containing the sentence.",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\n\n- Machine Translation\n- Language Identification",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing\n\n\n\n- Sentence pairs with empty/missing elements were dropped.\n- Identical pairs were dropped.\n- Trailing whitespace was stripped.\n- Rows were deduplicated based on all 3 columns including \"license\", on a config/subset/tsv file basis. Which means that a single config might contain two identical sentence pairs with different licenses. Or a different config/subset might contain the exact same row (most likely a different variant/dialect of the same language(s)).",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?\n\n\n\nWeblate users.",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-translation #task_categories-text2text-generation #annotations_creators-crowdsourced #size_categories-1M<n<10M #language-Afar #language-Abkhazian #language-Achinese #language-Acoli #language-Afrikaans #language-Afrihili #language-Assyrian Neo-Aramaic #language-Ainu (Japan) #language-South Levantine Arabic #language-Akan #language-Amharic #language-Aragonese #language-Old English (ca. 450-1100) #language-Angika #language-Levantine Arabic #language-Arabic #language-Mapudungun #language-Najdi Arabic #language-Assamese #language-Asturian #language-Aymara #language-Southern Aymara #language-Azerbaijani #language-South Azerbaijani #language-Bashkir #language-Bavarian #language-bd #language-Belarusian #language-Bemba (Zambia) #language-ber #language-Bulgarian #language-Bhojpuri #language-Bambara #language-Bengali #language-Tibetan #language-bp #language-Bakhtiari #language-Breton #language-Bodo (India) #language-Bosnian #language-Bulgarian #language-by #language-Catalan #language-Chechen #language-Cebuano #language-Central Kurdish #language-Mandarin Chinese #language-cn #language-Montenegrin #language-Corsican #language-Cree #language-Crimean Tatar #language-Czech #language-Kashubian #language-Chuvash #language-Welsh #language-cz #language-Danish #language-German #language-Domung #language-Dogri (macrolanguage) #language-Lower Sorbian #language-Duala #language-Middle Dutch (ca. 1050-1350) #language-Dhivehi #language-Dzongkha #language-eg #language-Modern Greek (1453-) #language-English #language-English #language-Middle English (1100-1500) #language-Esperanto #language-Spanish #language-Estonian #language-Basque #language-Extremaduran #language-Persian #language-Finnish #language-Filipino #language-Faroese #language-French #language-French #language-Middle French (ca. 1400-1600) #language-Arpitan #language-Eastern Frisian #language-fu #language-Friulian #language-Western Frisian #language-Irish #language-gb #language-Scottish Gaelic #language-Galician #language-Gilaki #language-Middle High German (ca. 
1050-1500) #language-Guarani #language-gr #language-Swiss German #language-Gujarati #language-Wayuu #language-Paraguayan Guaraní #language-Guambiano #language-Gun #language-Manx #language-Hausa #language-Hawaiian #language-Hebrew #language-Hindi #language-Chhattisgarhi #language-Croatian #language-Hunsrik #language-Upper Sorbian #language-Haitian #language-Hungarian #language-Armenian #language-Herero #language-Interlingua (International Auxiliary Language Association) #language-Indonesian #language-Interlingue #language-Igbo #language-in #language-Ido #language-Icelandic #language-Italian #language-iw #language-Japanese #language-Jamaican Creole English #language-Lojban #language-ji #language-jp #language-Japanese #language-Javanese #language-Georgian #language-Kabyle #language-Kongo #language-Kazakh #language-Kalaallisut #language-Khmer #language-Northern Kurdish #language-Kannada #language-Korean #language-Konkani (macrolanguage) #language-Kanuri #language-Karelian #language-Kashmiri #language-Kölsch #language-Kurdish #language-Cornish #language-Kirghiz #language-Latin #language-Luxembourgish #language-Lingua Franca Nova #language-Ganda #language-Limburgan #language-lk #language-Lingala #language-Lao #language-Lithuanian #language-Latgalian #language-Latvian #language-Literary Chinese #language-Maithili #language-me #language-Malagasy #language-Eastern Mari #language-Maori #language-Karbi #language-Macedonian #language-Malayalam #language-Mongolian #language-Manchu #language-Manipuri #language-Mon #language-mo #language-Marathi #language-Malay (macrolanguage) #language-Maltese #language-Burmese #language-Nauru #language-nah #language-Min Nan Chinese #language-Neapolitan #language-Norwegian Bokmål #language-Low German #language-Nepali (macrolanguage) #language-Dutch #language-Norwegian Nynorsk #language-Norwegian #language-np #language-N'Ko #language-Nyanja #language-Occitan (post 1500) #language-Ojibwa #language-Oromo #language-Oriya (macrolanguage) #language-Ossetian #language-Ottoman Turkish (1500-1928) #language-Panjabi #language-Pampanga #language-Papiamento #language-Páez #language-Old Persian (ca. 600-400 B.C.) 
#language-pk #language-Polish #language-Piemontese #language-pr #language-Prussian #language-Pushto #language-Portuguese #language-pu #language-qt #language-Réunion Creole French #language-Romansh #language-Romanian #language-Romany #language-Russian #language-Rusyn #language-Kinyarwanda #language-Central Okinawan #language-Sanskrit #language-Yakut #language-sai #language-Santali #language-Sardinian #language-Scots #language-Sindhi #language-Southern Kurdish #language-Northern Sami #language-Serbo-Croatian #language-Shan #language-Sinhala #language-Slovak #language-Saraiki #language-Slovenian #language-Samoan #language-Southern Sami #language-Shona #language-Somali #language-Albanian #language-Serbian #language-Southern Sotho #language-Sundanese #language-Swedish #language-Swahili (macrolanguage) #language-Silesian #language-Tamil #language-Tamil #language-Telugu #language-Tetum #language-Tajik #language-Thai #language-Tigrinya #language-Turkmen #language-Tagalog #language-Klingon #language-Tswana #language-Tonga (Tonga Islands) #language-Toki Pona #language-Turkish #language-Sediq #language-Tatar #language-Tumbuka #language-Twi #language-Tahitian #language-Central Atlas Tamazight #language-ua #language-Udmurt #language-Uighur #language-Ukrainian #language-Undetermined #language-Urdu #language-us #language-Uzbek #language-Venetian #language-Vietnamese #language-Vlaams #language-Walloon #language-Walser #language-Wolof #language-Xhosa #language-Yiddish #language-Yoruba #language-Yue Chinese #language-Standard Moroccan Tamazight #language-Chinese #language-Zulu #region-us \n",
"# Dataset Card for Weblate Translations\n\n\n\nA dataset containing strings from projects hosted on Weblate and their translations into other languages.\nPlease consider donating or contributing to Weblate if you find this dataset useful.\n\nTo avoid rows with values like \"None\" and \"N/A\" being interpreted as missing values, pass the keep_default_na parameter like this:",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License: Each sentence pair in the dataset has a corresponding license in the \"license\" column. This license is the one specified in the component or project containing the sentence.",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\n\n- Machine Translation\n- Language Identification",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing\n\n\n\n- Sentence pairs with empty/missing elements were dropped.\n- Identical pairs were dropped.\n- Trailing whitespace was stripped.\n- Rows were deduplicated based on all 3 columns including \"license\", on a config/subset/tsv file basis. Which means that a single config might contain two identical sentence pairs with different licenses. Or a different config/subset might contain the exact same row (most likely a different variant/dialect of the same language(s)).",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?\n\n\n\nWeblate users.",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
1802,
86,
4,
79,
29,
10,
4,
9,
6,
5,
7,
4,
127,
10,
9,
5,
13,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: "
] |
d1a07bfe6b8d894bd4b06f07c5acb2a68f1e567a | # Dataset Card for "unit-test-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | erishabh/unit-test-v2 | [
"region:us"
] | 2024-01-07T22:50:47+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "completion", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2289947144, "num_examples": 215409}], "download_size": 239191939, "dataset_size": 2289947144}} | 2024-01-07T22:51:12+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "unit-test-v2"
More Information needed | [
"# Dataset Card for \"unit-test-v2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"unit-test-v2\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"unit-test-v2\"\n\nMore Information needed"
] |
7a987eafb3e09f293ee20c69a5d36655412cbb2a |
# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-3T-1.1bee
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BEE-spoke-data/TinyLlama-3T-1.1bee](https://huggingface.co/BEE-spoke-data/TinyLlama-3T-1.1bee) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
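
# one config per sub-task; the split is either "latest" or a run timestamp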
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-3T-1.1bee",
"harness_winogrande_5",
split="train")
```
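
Each split holds the per-example records for that task, so you can inspect individual predictions rather than only the aggregated scores.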
## Latest results
These are the [latest results from run 2024-01-07T23:10:41.874868](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-3T-1.1bee/blob/main/results_2024-01-07T23-10-41.874868.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2639999673426553,
"acc_stderr": 0.031097948131753084,
"acc_norm": 0.26579484215214555,
"acc_norm_stderr": 0.03189164317426905,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.38125036947687346,
"mc2_stderr": 0.014406514283056868
},
"harness|arc:challenge|25": {
"acc": 0.30887372013651876,
"acc_stderr": 0.013501770929344004,
"acc_norm": 0.3378839590443686,
"acc_norm_stderr": 0.013822047922283524
},
"harness|hellaswag|10": {
"acc": 0.44722166899024096,
"acc_stderr": 0.004961904949171383,
"acc_norm": 0.602867954590719,
"acc_norm_stderr": 0.004883037758919969
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677084,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677084
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.02960562398177124,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.02960562398177124
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.03013590647851756,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.03013590647851756
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131183,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131183
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184763,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184763
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102146,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208834,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208834
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.0219169577092138,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.0219169577092138
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766104,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766104
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729907,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3374233128834356,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.3374233128834356,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02812096650391441,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02812096650391441
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.016050792148036553,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.016050792148036553
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.02218347766841285,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.02218347766841285
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850423,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850423
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3183279742765273,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.3183279742765273,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.0253895125527299,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.0253895125527299
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443738,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654917,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654917
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528044,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.03550920185689629,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.03550920185689629
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.38125036947687346,
"mc2_stderr": 0.014406514283056868
},
"harness|winogrande|5": {
"acc": 0.6022099447513812,
"acc_stderr": 0.013755743513749027
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
}
}
```
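
If you only need the aggregated numbers rather than the per-sample details, the results file linked above can also be fetched as plain JSON. A minimal sketch (assuming the file's layout matches the snippet above; the filename is taken from the link at the top of this section):

```python
import json

from huggingface_hub import hf_hub_download

# fetch the aggregated results file referenced in the link above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-3T-1.1bee",
    filename="results_2024-01-07T23-10-41.874868.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)

# handle both layouts: metrics at the top level (as in the snippet above)
# or nested under a "results" key
metrics_by_task = payload.get("results", payload)
for task, metrics in metrics_by_task.items():
    if isinstance(metrics, dict) and "acc" in metrics:
        print(task, metrics["acc"])
```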
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-3T-1.1bee | [
"region:us"
] | 2024-01-07T23:12:30+00:00 | {"pretty_name": "Evaluation run of BEE-spoke-data/TinyLlama-3T-1.1bee", "dataset_summary": "Dataset automatically created during the evaluation run of model [BEE-spoke-data/TinyLlama-3T-1.1bee](https://huggingface.co/BEE-spoke-data/TinyLlama-3T-1.1bee) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-3T-1.1bee\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T23:10:41.874868](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-3T-1.1bee/blob/main/results_2024-01-07T23-10-41.874868.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2639999673426553,\n \"acc_stderr\": 0.031097948131753084,\n \"acc_norm\": 0.26579484215214555,\n \"acc_norm_stderr\": 0.03189164317426905,\n \"mc1\": 0.2215422276621787,\n \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.38125036947687346,\n \"mc2_stderr\": 0.014406514283056868\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.30887372013651876,\n \"acc_stderr\": 0.013501770929344004,\n \"acc_norm\": 0.3378839590443686,\n \"acc_norm_stderr\": 0.013822047922283524\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44722166899024096,\n \"acc_stderr\": 0.004961904949171383,\n \"acc_norm\": 0.602867954590719,\n \"acc_norm_stderr\": 0.004883037758919969\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677084,\n \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677084\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874171,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874171\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n \"acc_stderr\": 0.02960562398177124,\n \"acc_norm\": 0.18497109826589594,\n \"acc_norm_stderr\": 0.02960562398177124\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.03013590647851756,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.03013590647851756\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131183,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131183\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184763,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184763\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n \"acc_stderr\": 0.03455071019102146,\n \"acc_norm\": 0.18253968253968253,\n \"acc_norm_stderr\": 0.03455071019102146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n \"acc_stderr\": 0.024251071262208834,\n \"acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.024251071262208834\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n \"acc_norm\": 
0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.0219169577092138,\n \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.0219169577092138\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766104,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766104\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279472,\n \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279472\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729907,\n \"acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729907\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.031660096793998116,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.031660096793998116\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.36771300448430494,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3374233128834356,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.3374233128834356,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.02812096650391441,\n \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.02812096650391441\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 
0.044084400227680794\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n \"acc_stderr\": 0.016050792148036553,\n \"acc_norm\": 0.2796934865900383,\n \"acc_norm_stderr\": 0.016050792148036553\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.02218347766841285,\n \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.02218347766841285\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.014487500852850423,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.014487500852850423\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3183279742765273,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.3183279742765273,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2375886524822695,\n \"acc_stderr\": 0.0253895125527299,\n \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.0253895125527299\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n \"acc_stderr\": 0.011025499291443738,\n \"acc_norm\": 0.24771838331160365,\n \"acc_norm_stderr\": 0.011025499291443738\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654917,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654917\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528044,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528044\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n \"acc_stderr\": 0.03550920185689629,\n \"acc_norm\": 0.29518072289156627,\n \"acc_norm_stderr\": 0.03550920185689629\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.035469769593931624,\n \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.035469769593931624\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.38125036947687346,\n \"mc2_stderr\": 0.014406514283056868\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6022099447513812,\n \"acc_stderr\": 
0.013755743513749027\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \"acc_stderr\": 0.0018535550440036204\n }\n}\n```", "repo_url": "https://huggingface.co/BEE-spoke-data/TinyLlama-3T-1.1bee", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|arc:challenge|25_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|gsm8k|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hellaswag|10_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-10-41.874868.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-10-41.874868.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-10-41.874868.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T23-10-41.874868.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-10-41.874868.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["**/details_harness|winogrande|5_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-07T23-10-41.874868.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T23_10_41.874868", "path": ["results_2024-01-07T23-10-41.874868.parquet"]}, {"split": "latest", "path": ["results_2024-01-07T23-10-41.874868.parquet"]}]}]} | 2024-01-07T23:12:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-3T-1.1bee
Dataset automatically created during the evaluation run of model BEE-spoke-data/TinyLlama-3T-1.1bee on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
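A minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming:

```python
from datasets import load_dataset

# Load one evaluated task; "train" points at the latest results.
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-3T-1.1bee",
	"harness_winogrande_5",
	split="train")
```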
## Latest results
These are the latest results from run 2024-01-07T23:10:41.874868 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-3T-1.1bee\n\n\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/TinyLlama-3T-1.1bee on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T23:10:41.874868(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-3T-1.1bee\n\n\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/TinyLlama-3T-1.1bee on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T23:10:41.874868(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-3T-1.1bee\n\n\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/TinyLlama-3T-1.1bee on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T23:10:41.874868(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
678ae5e9425ed794e5e903b3f8e21b0a1c5341aa |
# Dataset Card for Evaluation run of NeuralNovel/Aeryth-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Aeryth-7B-v0.1](https://huggingface.co/NeuralNovel/Aeryth-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Aeryth-7B-v0.1",
"harness_winogrande_5",
split="train")
```
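Before loading, you can also enumerate what is available and pull the aggregated scores. The snippet below is a minimal sketch using only the `datasets` library; the `results` configuration and its `latest` split are the ones described above.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_NeuralNovel__Aeryth-7B-v0.1"

# The 63 per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs))

# "latest" always points to the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```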
## Latest results
These are the [latest results from run 2024-01-14T12:31:11.639995](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Aeryth-7B-v0.1/blob/main/results_2024-01-14T12-31-11.639995.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.607832340017972,
"acc_stderr": 0.033171072669556316,
"acc_norm": 0.6134606437151463,
"acc_norm_stderr": 0.03384290514267795,
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.6357466374094296,
"mc2_stderr": 0.015661867399479723
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256524,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180646
},
"harness|hellaswag|10": {
"acc": 0.6514638518223461,
"acc_stderr": 0.004755329243976671,
"acc_norm": 0.835291774546903,
"acc_norm_stderr": 0.0037015895712743134
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562417,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787586,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787586
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039504,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039504
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077785,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.01471168438613996,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.01471168438613996
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.02507071371915319,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.02507071371915319
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.015949308790233645,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.015949308790233645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622866,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622866
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.01961085147488029,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.01961085147488029
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.6357466374094296,
"mc2_stderr": 0.015661867399479723
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.01222375443423362
},
"harness|gsm8k|5": {
"acc": 0.36087945413191813,
"acc_stderr": 0.01322862675392514
}
}
```
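If you download the linked results file directly, the headline numbers above can be extracted with standard-library Python. This is a sketch that assumes the file contains the dictionary printed above; the on-disk file may nest it under extra keys, so adjust the lookups accordingly.

```python
import json

# Example filename taken from the link above; substitute your local path.
with open("results_2024-01-14T12-31-11.639995.json") as f:
    data = json.load(f)

overall = data["all"]  # aggregate block, as shown above
print(f"acc      = {overall['acc']:.4f} +/- {overall['acc_stderr']:.4f}")
print(f"acc_norm = {overall['acc_norm']:.4f} +/- {overall['acc_norm_stderr']:.4f}")

# Per-task entries are keyed like "harness|arc:challenge|25".
arc = data["harness|arc:challenge|25"]
print(f"ARC 25-shot acc_norm = {arc['acc_norm']:.4f}")
```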
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NeuralNovel__Aeryth-7B-v0.1 | [
"region:us"
] | 2024-01-07T23:24:19+00:00 | {"pretty_name": "Evaluation run of NeuralNovel/Aeryth-7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeuralNovel/Aeryth-7B-v0.1](https://huggingface.co/NeuralNovel/Aeryth-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Aeryth-7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T12:31:11.639995](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Aeryth-7B-v0.1/blob/main/results_2024-01-14T12-31-11.639995.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.607832340017972,\n \"acc_stderr\": 0.033171072669556316,\n \"acc_norm\": 0.6134606437151463,\n \"acc_norm_stderr\": 0.03384290514267795,\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6357466374094296,\n \"mc2_stderr\": 0.015661867399479723\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256524,\n \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180646\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6514638518223461,\n \"acc_stderr\": 0.004755329243976671,\n \"acc_norm\": 0.835291774546903,\n \"acc_norm_stderr\": 0.0037015895712743134\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885117,\n \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885117\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787586,\n \"acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787586\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501954,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501954\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039504,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039504\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077785,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077785\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.7841634738186463,\n \"acc_stderr\": 0.01471168438613996,\n \"acc_norm\": 0.7841634738186463,\n \"acc_norm_stderr\": 0.01471168438613996\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.02507071371915319,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.02507071371915319\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n \"acc_stderr\": 0.012654565234622866,\n \"acc_norm\": 0.43285528031290743,\n \"acc_norm_stderr\": 0.012654565234622866\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.01961085147488029,\n \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.01961085147488029\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6357466374094296,\n \"mc2_stderr\": 0.015661867399479723\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.01222375443423362\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36087945413191813,\n \"acc_stderr\": 
0.01322862675392514\n }\n}\n```", "repo_url": "https://huggingface.co/NeuralNovel/Aeryth-7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|arc:challenge|25_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|arc:challenge|25_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|arc:challenge|25_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|gsm8k|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|gsm8k|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|gsm8k|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hellaswag|10_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hellaswag|10_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hellaswag|10_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-22-00.392280.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-22-00.392280.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T23-22-00.392280.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-11-57.804296.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T00-11-57.804296.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-38-01.089688.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-38-01.089688.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-38-01.089688.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-31-11.639995.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-31-11.639995.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-31-11.639995.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T12-31-11.639995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": 
["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-31-11.639995.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": 
"2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": 
["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["**/details_harness|winogrande|5_2024-01-07T23-22-00.392280.parquet"]}, {"split": "2024_01_08T00_11_57.804296", "path": ["**/details_harness|winogrande|5_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["**/details_harness|winogrande|5_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["**/details_harness|winogrande|5_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T12-31-11.639995.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T23_22_00.392280", "path": ["results_2024-01-07T23-22-00.392280.parquet"]}, 
{"split": "2024_01_08T00_11_57.804296", "path": ["results_2024-01-08T00-11-57.804296.parquet"]}, {"split": "2024_01_13T23_38_01.089688", "path": ["results_2024-01-13T23-38-01.089688.parquet"]}, {"split": "2024_01_14T12_31_11.639995", "path": ["results_2024-01-14T12-31-11.639995.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T12-31-11.639995.parquet"]}]}]} | 2024-01-14T12:33:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NeuralNovel/Aeryth-7B-v0.1
Dataset automatically created during the evaluation run of model NeuralNovel/Aeryth-7B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
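A minimal sketch, assuming the repository follows the `details_<org>__<model>` naming convention used by the other evaluation cards in this dump:

```python
from datasets import load_dataset

# Load the details for one evaluation task; "train" always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Aeryth-7B-v0.1",
	"harness_winogrande_5",
	split="train")
```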
## Latest results
These are the latest results from run 2024-01-14T12:31:11.639995 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
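To retrieve the aggregated numbers from this run, a minimal sketch assuming the `results` configuration and `latest` split declared in this card's metadata:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; "latest" is its newest split.
results = load_dataset("open-llm-leaderboard/details_NeuralNovel__Aeryth-7B-v0.1",
	"results",
	split="latest")
```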
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NeuralNovel/Aeryth-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Aeryth-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T12:31:11.639995(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NeuralNovel/Aeryth-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Aeryth-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T12:31:11.639995(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NeuralNovel/Aeryth-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Aeryth-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-14T12:31:11.639995(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
89087a1ea7ba831a34b9d3101281a6f1ff9334ea | # HistoryTrans
HistoryTrans is a Classical Chinese translation dataset that improves the quality and practical usefulness of Classical Chinese translation through data preprocessing and quality control.

See our project homepage: [HistoryTrans Classical Chinese Translation](https://github.com/HistoryTrans/HistoryTrans).
## Dataset Details
### Dataset Sources

- **Main corpus:** [Classical-Modern](https://github.com/NiuTrans/Classical-Modern)
- **Additional material:** extracted from the *Twenty-Four Histories* (二十四史) and the *Draft History of Qing* (清史稿)
### Dataset Structure

The dataset contains the following JSONL files:

- `train_01_04.jsonl`: the training set, used mainly to train the translation model.
- `val_01_04.jsonl`: the validation set, used for model tuning and evaluation during training.
- `test_01_04.jsonl`: the test set, used to evaluate final model performance.

Each JSON object includes:

- `inputs`: the original Classical Chinese text
- `truth`: the accurate modern Chinese translation
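A minimal loading sketch; the `load_split` helper and root-level file paths are illustrative assumptions, not part of the dataset itself:

```python
import json

def load_split(path):
    """Read one JSONL split into a list of {"inputs": ..., "truth": ...} records."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f]

train = load_split("train_01_04.jsonl")
print(train[0]["inputs"], "->", train[0]["truth"])
```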
For example:
```json
{"inputs": "昕曰: 回纥之功,唐已报之矣。", "truth": "萧昕反驳说: 回纥的功劳,唐朝已经报答了。"}
{"inputs": "然县令所犯在恩前,中人所犯在恩后。", "truth": "但是县令所犯罪过在施恩大赦之前,宦官所犯罪过在施恩赦免之后。"}
``` | HistoryTrans/Dataset | [
"task_categories:translation",
"size_categories:100M<n<1B",
"language:zh",
"license:mit",
"translation",
"古文翻译",
"文言文翻译",
"region:us"
] | 2024-01-07T23:53:26+00:00 | {"language": ["zh"], "license": "mit", "size_categories": ["100M<n<1B"], "task_categories": ["translation"], "pretty_name": "\u53e4\u6587\u7ffb\u8bd1\u6570\u636e\u96c6", "tags": ["translation", "\u53e4\u6587\u7ffb\u8bd1", "\u6587\u8a00\u6587\u7ffb\u8bd1"]} | 2024-01-09T21:03:47+00:00 | [] | [
"zh"
] | TAGS
#task_categories-translation #size_categories-100M<n<1B #language-Chinese #license-mit #translation #古文翻译 #文言文翻译 #region-us
| # HistoryTrans
HistoryTrans 是一个古文翻译数据集,通过数据预处理和质量控制,来提高古文翻译的质量和实用性。
参考我们的项目主页HistoryTrans古文翻译.
## 数据集详细信息
### 数据集来源
- 主体: Classical-Modern
- 额外补充: :《二十四史》和《清史稿》中提取
### 数据集结构
数据集包含以下 JSONL 文件:
- 'train_01_04.jsonl': 训练集,主要用于训练翻译模型。
- 'val_01_04.jsonl': 验证集,用于训练过程中的模型微调和评估。
- 'test_01_04.jsonl': 测试集,用于评估最终模型性能。
每个 JSON 对象包括:
- 'inputs': 原始古文
- 'truth': 准确翻译
例如:
| [
"# HistoryTrans\n\nHistoryTrans 是一个古文翻译数据集,通过数据预处理和质量控制,来提高古文翻译的质量和实用性。\n\n参考我们的项目主页HistoryTrans古文翻译.",
"## 数据集详细信息",
"### 数据集来源\n\n- 主体: Classical-Modern\n- 额外补充: :《二十四史》和《清史稿》中提取",
"### 数据集结构\n\n数据集包含以下 JSONL 文件:\n\n- 'train_01_04.jsonl': 训练集,主要用于训练翻译模型。\n- 'val_01_04.jsonl': 验证集,用于训练过程中的模型微调和评估。\n- 'test_01_04.jsonl': 测试集,用于评估最终模型性能。\n\n每个 JSON 对象包括:\n\n- 'inputs': 原始古文\n- 'truth': 准确翻译\n\n例如:"
] | [
"TAGS\n#task_categories-translation #size_categories-100M<n<1B #language-Chinese #license-mit #translation #古文翻译 #文言文翻译 #region-us \n",
"# HistoryTrans\n\nHistoryTrans 是一个古文翻译数据集,通过数据预处理和质量控制,来提高古文翻译的质量和实用性。\n\n参考我们的项目主页HistoryTrans古文翻译.",
"## 数据集详细信息",
"### 数据集来源\n\n- 主体: Classical-Modern\n- 额外补充: :《二十四史》和《清史稿》中提取",
"### 数据集结构\n\n数据集包含以下 JSONL 文件:\n\n- 'train_01_04.jsonl': 训练集,主要用于训练翻译模型。\n- 'val_01_04.jsonl': 验证集,用于训练过程中的模型微调和评估。\n- 'test_01_04.jsonl': 测试集,用于评估最终模型性能。\n\n每个 JSON 对象包括:\n\n- 'inputs': 原始古文\n- 'truth': 准确翻译\n\n例如:"
] | [
49,
45,
6,
35,
119
] | [
"passage: TAGS\n#task_categories-translation #size_categories-100M<n<1B #language-Chinese #license-mit #translation #古文翻译 #文言文翻译 #region-us \n# HistoryTrans\n\nHistoryTrans 是一个古文翻译数据集,通过数据预处理和质量控制,来提高古文翻译的质量和实用性。\n\n参考我们的项目主页HistoryTrans古文翻译.## 数据集详细信息### 数据集来源\n\n- 主体: Classical-Modern\n- 额外补充: :《二十四史》和《清史稿》中提取### 数据集结构\n\n数据集包含以下 JSONL 文件:\n\n- 'train_01_04.jsonl': 训练集,主要用于训练翻译模型。\n- 'val_01_04.jsonl': 验证集,用于训练过程中的模型微调和评估。\n- 'test_01_04.jsonl': 测试集,用于评估最终模型性能。\n\n每个 JSON 对象包括:\n\n- 'inputs': 原始古文\n- 'truth': 准确翻译\n\n例如:"
] |
08cb0f0ecb2e25dffbd5325133a461654288c478 |
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# Load the details for one evaluation task; "train" always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-07T23:59:12.319843](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT/blob/main/results_2024-01-07T23-59-12.319843.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2694073014973233,
"acc_stderr": 0.031115984816531068,
"acc_norm": 0.2715715014019466,
"acc_norm_stderr": 0.03192187260750218,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557994,
"mc2": 0.43034383734131576,
"mc2_stderr": 0.014837180597154165
},
"harness|arc:challenge|25": {
"acc": 0.3250853242320819,
"acc_stderr": 0.013688147309729117,
"acc_norm": 0.3395904436860068,
"acc_norm_stderr": 0.013839039762820167
},
"harness|hellaswag|10": {
"acc": 0.48207528380800635,
"acc_stderr": 0.004986573992451682,
"acc_norm": 0.6254730133439554,
"acc_norm_stderr": 0.004830113797327044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351586,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351586
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749905,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749905
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.02802022627120022,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.02802022627120022
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2064516129032258,
"acc_stderr": 0.023025899617188733,
"acc_norm": 0.2064516129032258,
"acc_norm_stderr": 0.023025899617188733
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.033464098810559534,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.033464098810559534
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2849740932642487,
"acc_stderr": 0.0325771407770966,
"acc_norm": 0.2849740932642487,
"acc_norm_stderr": 0.0325771407770966
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.024433016466052455,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.024433016466052455
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341923,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341923
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29724770642201837,
"acc_stderr": 0.01959570722464354,
"acc_norm": 0.29724770642201837,
"acc_norm_stderr": 0.01959570722464354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083289,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083289
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17937219730941703,
"acc_stderr": 0.025749819569192804,
"acc_norm": 0.17937219730941703,
"acc_norm_stderr": 0.025749819569192804
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.125,
"acc_stderr": 0.03139045014587016,
"acc_norm": 0.125,
"acc_norm_stderr": 0.03139045014587016
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.030679022765498835,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.030679022765498835
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24393358876117496,
"acc_stderr": 0.015357212665829465,
"acc_norm": 0.24393358876117496,
"acc_norm_stderr": 0.015357212665829465
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3022508038585209,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.3022508038585209,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.1882716049382716,
"acc_stderr": 0.02175186606081588,
"acc_norm": 0.1882716049382716,
"acc_norm_stderr": 0.02175186606081588
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26401564537157757,
"acc_stderr": 0.011258435537723818,
"acc_norm": 0.26401564537157757,
"acc_norm_stderr": 0.011258435537723818
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.02873932851398358,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.02873932851398358
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464606,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464606
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23265306122448978,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.23265306122448978,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401467,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401467
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557994,
"mc2": 0.43034383734131576,
"mc2_stderr": 0.014837180597154165
},
"harness|winogrande|5": {
"acc": 0.5682715074980268,
"acc_stderr": 0.013920872110010711
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225278
}
}
```
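To pull these aggregated numbers programmatically, a minimal sketch assuming the same `results` configuration and `latest` split used by the other evaluation cards in this dump:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated run results; "latest" is its newest split.
results = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT",
	"results",
	split="latest")
```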
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT | [
"region:us"
] | 2024-01-08T00:01:38+00:00 | {"pretty_name": "Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT", "dataset_summary": "Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-07T23:59:12.319843](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT/blob/main/results_2024-01-07T23-59-12.319843.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2694073014973233,\n \"acc_stderr\": 0.031115984816531068,\n \"acc_norm\": 0.2715715014019466,\n \"acc_norm_stderr\": 0.03192187260750218,\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557994,\n \"mc2\": 0.43034383734131576,\n \"mc2_stderr\": 0.014837180597154165\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3250853242320819,\n \"acc_stderr\": 0.013688147309729117,\n \"acc_norm\": 0.3395904436860068,\n \"acc_norm_stderr\": 0.013839039762820167\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48207528380800635,\n \"acc_stderr\": 0.004986573992451682,\n \"acc_norm\": 0.6254730133439554,\n \"acc_norm_stderr\": 0.004830113797327044\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351586,\n \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351586\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749905,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749905\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.02802022627120022,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.02802022627120022\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.034165204477475494,\n \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.034165204477475494\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2064516129032258,\n \"acc_stderr\": 0.023025899617188733,\n \"acc_norm\": 0.2064516129032258,\n \"acc_norm_stderr\": 0.023025899617188733\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.033464098810559534,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.033464098810559534\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2849740932642487,\n \"acc_stderr\": 0.0325771407770966,\n 
\"acc_norm\": 0.2849740932642487,\n \"acc_norm_stderr\": 0.0325771407770966\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.024433016466052455,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.024433016466052455\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341923,\n \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341923\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.29724770642201837,\n \"acc_stderr\": 0.01959570722464354,\n \"acc_norm\": 0.29724770642201837,\n \"acc_norm_stderr\": 0.01959570722464354\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083289,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083289\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17937219730941703,\n \"acc_stderr\": 0.025749819569192804,\n \"acc_norm\": 0.17937219730941703,\n \"acc_norm_stderr\": 0.025749819569192804\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.125,\n \"acc_stderr\": 0.03139045014587016,\n \"acc_norm\": 0.125,\n \"acc_norm_stderr\": 0.03139045014587016\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260594,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260594\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n \"acc_stderr\": 0.030679022765498835,\n \"acc_norm\": 0.3247863247863248,\n \"acc_norm_stderr\": 0.030679022765498835\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 
0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24393358876117496,\n \"acc_stderr\": 0.015357212665829465,\n \"acc_norm\": 0.24393358876117496,\n \"acc_norm_stderr\": 0.015357212665829465\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.024257901705323374,\n \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.024257901705323374\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.3022508038585209,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.1882716049382716,\n \"acc_stderr\": 0.02175186606081588,\n \"acc_norm\": 0.1882716049382716,\n \"acc_norm_stderr\": 0.02175186606081588\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26401564537157757,\n \"acc_stderr\": 0.011258435537723818,\n \"acc_norm\": 0.26401564537157757,\n \"acc_norm_stderr\": 0.011258435537723818\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3382352941176471,\n \"acc_stderr\": 0.02873932851398358,\n \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.02873932851398358\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.043091187099464606,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.043091187099464606\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.23265306122448978,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.23265306122448978,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401467,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401467\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557994,\n \"mc2\": 0.43034383734131576,\n \"mc2_stderr\": 0.014837180597154165\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5682715074980268,\n \"acc_stderr\": 0.013920872110010711\n },\n \"harness|gsm8k|5\": 
{\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225278\n }\n}\n```", "repo_url": "https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|arc:challenge|25_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|gsm8k|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hellaswag|10_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-59-12.319843.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-59-12.319843.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-59-12.319843.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-07T23-59-12.319843.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-59-12.319843.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["**/details_harness|winogrande|5_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-07T23-59-12.319843.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_07T23_59_12.319843", "path": ["results_2024-01-07T23-59-12.319843.parquet"]}, {"split": "latest", "path": ["results_2024-01-07T23-59-12.319843.parquet"]}]}]} | 2024-01-08T00:01:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT
Dataset automatically created during the evaluation run of model princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-07T23:59:12.319843 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT\n\n\n\nDataset automatically created during the evaluation run of model princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T23:59:12.319843(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT\n\n\n\nDataset automatically created during the evaluation run of model princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-07T23:59:12.319843(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
201,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT\n\n\n\nDataset automatically created during the evaluation run of model princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-07T23:59:12.319843(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
29e65374a4fca0b1797a2b23c52de18c8b8d02f7 | # Dataset Card for "autotree_snnxor_n15_l2_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | yzhuang/autotree_snnxor_n15_l2_10 | [
"region:us"
] | 2024-01-08T00:02:33+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "input_x", "sequence": {"sequence": "float32"}}, {"name": "input_y", "sequence": {"sequence": "float32"}}, {"name": "input_y_clean", "sequence": {"sequence": "float32"}}, {"name": "rtg", "sequence": "float64"}, {"name": "status", "sequence": {"sequence": "float32"}}, {"name": "split_threshold", "sequence": {"sequence": "float32"}}, {"name": "split_dimension", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 484120000, "num_examples": 10000}, {"name": "validation", "num_bytes": 484120000, "num_examples": 10000}, {"name": "test", "num_bytes": 484120000, "num_examples": 10000}], "download_size": 597791512, "dataset_size": 1452360000}} | 2024-01-08T00:03:12+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "autotree_snnxor_n15_l2_10"
More Information needed | [
"# Dataset Card for \"autotree_snnxor_n15_l2_10\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"autotree_snnxor_n15_l2_10\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"autotree_snnxor_n15_l2_10\"\n\nMore Information needed"
] |
d8c46c6235877e7b50093d716f5d2ed7a795f6b8 |
# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat-0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/TinyLlama-MoE-Chat-0.1](https://huggingface.co/maywell/TinyLlama-MoE-Chat-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1",
"harness_winogrande_5",
split="train")
```
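
The aggregated "results" configuration mentioned above can be loaded the same way. As a minimal sketch — the `"results"` config name and `"latest"` split are taken from this card's own file listing, and the call follows the standard `datasets` API:

```python
from datasets import load_dataset

# Load the aggregated metrics for this model; the "latest" split always
# points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1",
    "results",
    split="latest",
)

# One row per run, holding the aggregated metrics for that run.
print(results[0])
```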
## Latest results
These are the [latest results from run 2024-01-08T02:02:06.630482](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1/blob/main/results_2024-01-08T02-02-06.630482.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2969807305972257,
"acc_stderr": 0.03233735615614679,
"acc_norm": 0.2990966138461531,
"acc_norm_stderr": 0.03313317327044684,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456411,
"mc2": 0.3781526709576764,
"mc2_stderr": 0.01431580872082323
},
"harness|arc:challenge|25": {
"acc": 0.3267918088737201,
"acc_stderr": 0.013706665975587335,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.01388064457015621
},
"harness|hellaswag|10": {
"acc": 0.43397729535949015,
"acc_stderr": 0.0049460892301530284,
"acc_norm": 0.5672176857199761,
"acc_norm_stderr": 0.004944485990639527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708094,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708094
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906864,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.034273086529999365,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.034273086529999365
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.03239637046735703,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.03239637046735703
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28974358974358977,
"acc_stderr": 0.02300062824368795,
"acc_norm": 0.28974358974358977,
"acc_norm_stderr": 0.02300062824368795
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275798,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275798
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02934457250063435,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02934457250063435
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26788990825688075,
"acc_stderr": 0.018987462257978652,
"acc_norm": 0.26788990825688075,
"acc_norm_stderr": 0.018987462257978652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03214952147802747,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03214952147802747
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.03149328104507957,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.03149328104507957
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3459915611814346,
"acc_stderr": 0.030964810588786706,
"acc_norm": 0.3459915611814346,
"acc_norm_stderr": 0.030964810588786706
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.04236964753041019,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.04236964753041019
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.34355828220858897,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.34355828220858897,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.37606837606837606,
"acc_stderr": 0.03173393632969481,
"acc_norm": 0.37606837606837606,
"acc_norm_stderr": 0.03173393632969481
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.31800766283524906,
"acc_stderr": 0.016653486275615404,
"acc_norm": 0.31800766283524906,
"acc_norm_stderr": 0.016653486275615404
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500114,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961445,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961445
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.35691318327974275,
"acc_stderr": 0.02721042037593402,
"acc_norm": 0.35691318327974275,
"acc_norm_stderr": 0.02721042037593402
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3117283950617284,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.3117283950617284,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27640156453715775,
"acc_stderr": 0.011422153194553577,
"acc_norm": 0.27640156453715775,
"acc_norm_stderr": 0.011422153194553577
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.01802047414839358,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.01802047414839358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.04653429807913509,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.04653429807913509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.02721283588407315,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.02721283588407315
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03377310252209194,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03377310252209194
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456411,
"mc2": 0.3781526709576764,
"mc2_stderr": 0.01431580872082323
},
"harness|winogrande|5": {
"acc": 0.5966850828729282,
"acc_stderr": 0.013787257285896245
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.00410662063774967
}
}
```
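
As a small post-processing sketch, assuming the dictionary above has already been parsed into a Python dict named `latest` (for instance via `json.load` on the linked results file — the variable name is illustrative), the MMLU macro average can be recomputed from the per-subtask entries:

```python
# Macro-average accuracy over the hendrycksTest (MMLU) subtasks.
# `latest` is assumed to be the dict shown above; the key prefix matches
# the "harness|hendrycksTest-..." naming used throughout this card.
mmlu = [
    v["acc"]
    for k, v in latest.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU macro average over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```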
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1 | [
"region:us"
] | 2024-01-08T00:07:51+00:00 | {"pretty_name": "Evaluation run of maywell/TinyLlama-MoE-Chat-0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [maywell/TinyLlama-MoE-Chat-0.1](https://huggingface.co/maywell/TinyLlama-MoE-Chat-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T02:02:06.630482](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1/blob/main/results_2024-01-08T02-02-06.630482.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2969807305972257,\n \"acc_stderr\": 0.03233735615614679,\n \"acc_norm\": 0.2990966138461531,\n \"acc_norm_stderr\": 0.03313317327044684,\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456411,\n \"mc2\": 0.3781526709576764,\n \"mc2_stderr\": 0.01431580872082323\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3267918088737201,\n \"acc_stderr\": 0.013706665975587335,\n \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.01388064457015621\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.43397729535949015,\n \"acc_stderr\": 0.0049460892301530284,\n \"acc_norm\": 0.5672176857199761,\n \"acc_norm_stderr\": 0.004944485990639527\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138622,\n \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138622\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906864,\n \"acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906864\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.2709677419354839,\n \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.034273086529999365,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.034273086529999365\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.03239637046735703,\n \"acc_norm\": 0.27979274611398963,\n 
\"acc_norm_stderr\": 0.03239637046735703\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.28974358974358977,\n \"acc_stderr\": 0.02300062824368795,\n \"acc_norm\": 0.28974358974358977,\n \"acc_norm_stderr\": 0.02300062824368795\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275798,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275798\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02934457250063435,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02934457250063435\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26788990825688075,\n \"acc_stderr\": 0.018987462257978652,\n \"acc_norm\": 0.26788990825688075,\n \"acc_norm_stderr\": 0.018987462257978652\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802747,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802747\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.03149328104507957,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.03149328104507957\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3459915611814346,\n \"acc_stderr\": 0.030964810588786706,\n \"acc_norm\": 0.3459915611814346,\n \"acc_norm_stderr\": 0.030964810588786706\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.336322869955157,\n \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041019,\n \"acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041019\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.34355828220858897,\n \"acc_stderr\": 0.03731133519673893,\n \"acc_norm\": 0.34355828220858897,\n \"acc_norm_stderr\": 0.03731133519673893\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161549,\n \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161549\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.37606837606837606,\n \"acc_stderr\": 0.03173393632969481,\n \"acc_norm\": 0.37606837606837606,\n \"acc_norm_stderr\": 0.03173393632969481\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.31800766283524906,\n \"acc_stderr\": 0.016653486275615404,\n \"acc_norm\": 0.31800766283524906,\n \"acc_norm_stderr\": 0.016653486275615404\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500114,\n \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500114\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961445,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961445\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.35691318327974275,\n \"acc_stderr\": 0.02721042037593402,\n \"acc_norm\": 0.35691318327974275,\n \"acc_norm_stderr\": 0.02721042037593402\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3117283950617284,\n \"acc_stderr\": 0.02577311116963045,\n \"acc_norm\": 0.3117283950617284,\n \"acc_norm_stderr\": 0.02577311116963045\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27640156453715775,\n \"acc_stderr\": 0.011422153194553577,\n \"acc_norm\": 0.27640156453715775,\n \"acc_norm_stderr\": 0.011422153194553577\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.02928941340940319,\n \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.02928941340940319\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.272875816993464,\n \"acc_stderr\": 0.01802047414839358,\n \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.01802047414839358\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n \"acc_stderr\": 0.04653429807913509,\n \"acc_norm\": 0.38181818181818183,\n \"acc_norm_stderr\": 0.04653429807913509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.02721283588407315,\n \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.02721283588407315\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03377310252209194,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03377310252209194\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456411,\n \"mc2\": 0.3781526709576764,\n \"mc2_stderr\": 0.01431580872082323\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5966850828729282,\n \"acc_stderr\": 0.013787257285896245\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.022744503411675512,\n \"acc_stderr\": 0.00410662063774967\n }\n}\n```", "repo_url": "https://huggingface.co/maywell/TinyLlama-MoE-Chat-0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|arc:challenge|25_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|arc:challenge|25_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|gsm8k|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|gsm8k|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hellaswag|10_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hellaswag|10_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-05-57.757345.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T00-05-57.757345.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-02-06.630482.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-02-06.630482.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-02-06.630482.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T02-02-06.630482.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-02-06.630482.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": 
"2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-05-57.757345.parquet"]}, 
{"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["**/details_harness|winogrande|5_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": ["**/details_harness|winogrande|5_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T02-02-06.630482.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_08T00_05_57.757345", "path": ["results_2024-01-08T00-05-57.757345.parquet"]}, {"split": "2024_01_08T02_02_06.630482", "path": 
["results_2024-01-08T02-02-06.630482.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T02-02-06.630482.parquet"]}]}]} | 2024-01-08T02:04:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat-0.1
Dataset automatically created during the evaluation run of model maywell/TinyLlama-MoE-Chat-0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
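For instance, here is a minimal sketch with the `datasets` library. The repository id below is an assumption based on the usual naming scheme for leaderboard detail datasets; the configuration and split names are taken from this card's metadata:

```python
from datasets import load_dataset

# Assumed repository id (the usual "details_<org>__<model>" scheme);
# "harness_winogrande_5" and "latest" come from this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1",
    "harness_winogrande_5",  # one of the 63 evaluated-task configurations
    split="latest",          # always points to the most recent run
)
print(data)
```

Other per-task details load the same way by swapping the configuration name (e.g. "harness_gsm8k_5"), and the aggregated metrics live in the separate "results" configuration.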
## Latest results
These are the latest results from run 2024-01-08T02:02:06.630482 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat-0.1\n\n\n\nDataset automatically created during the evaluation run of model maywell/TinyLlama-MoE-Chat-0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T02:02:06.630482(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat-0.1\n\n\n\nDataset automatically created during the evaluation run of model maywell/TinyLlama-MoE-Chat-0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T02:02:06.630482(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat-0.1\n\n\n\nDataset automatically created during the evaluation run of model maywell/TinyLlama-MoE-Chat-0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T02:02:06.630482(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
0d8732ae93a2126c0355020dacf8305df3366e87 | The corpus for this study consists of multiple datasets of comparable text lengths, both machine-generated and human-written.
1. 1401 books:
   - 841 manually written abstracts provided by the Central University Library of Bucharest, representing descriptions of old Romanian documents (literary magazines and books dated between the 19th century and the present),
   - 560 book descriptions ([cartigratis.com](https://cartigratis.com/), accessed 8 January 2024);
2. 4320 news articles crawled from DigiNews ([digi24.ro](https://www.digi24.ro/), accessed 8 January 2024);
3. 557 medical texts acquired from several specialized publications:
- 71 texts from medical scientific journals ([srumb.ro](https://srumb.ro/index.php?page=revista&spage=numere&id=9), accessed 8 January 2024),
- 372 texts from scientific magazines ([medichub.ro](https://www.medichub.ro), accessed 8 January 2024),
   - 114 texts from a glossary of diseases ([sfatulmedicului.ro](https://www.sfatulmedicului.ro/boli-si-afectiuni), accessed 8 January 2024);
4. 1000 juridical/legal texts drawn from Romanian laws published in Monitorul Oficial ([monitoruloficial.ro](https://monitoruloficial.ro/), accessed 8 January 2024);
5. 109 scientific articles from the Romanian Journal of Human-Computer Interaction (RoCHI) ([rochi.utcluj.ro](http://rochi.utcluj.ro/), accessed 8 January 2024).
## MGT Dataset: Human-Written and Machine-Generated Texts
| Domain | Method | Model | Avg TTR | Doc Count | Aggregate |
|----------|--------------------|-----------------|---------|-----------|-----------|
| Books | Human | Human | 0.7447 | 1401 | 11,208 |
| | Completion | RoGPT2 | 0.6615 | 1401 | |
| | Completion | GPT-Neo-Ro | 0.7011 | 1401 | |
| | Completion | davinci-003 | 0.6125 | 1401 | |
| | Backtranslation | davinci-003 | 0.7652 | 1401 | |
| | Paraphrasing | Flan-T5 | 0.8708 | 1401 | |
| | Backtranslation | Opus-MT | 0.7581 | 1401 | |
| | Backtranslation | mBART | 0.7379 | 1401 | |
| News | Human | Human | 0.6510 | 4320 | 34,560 |
| | Completion | RoGPT2 | 0.6762 | 4320 | |
| | Completion | GPT-Neo-Ro | 0.6867 | 4320 | |
| | Completion | davinci-003 | 0.6508 | 4320 | |
| | Backtranslation | davinci-003 | 0.7798 | 4320 | |
| | Paraphrasing | Flan-T5 | 0.8389 | 4320 | |
| | Backtranslation | Opus-MT | 0.6589 | 4320 | |
| | Backtranslation | mBART | 0.7024 | 4320 | |
| Medical | Human | Human | 0.6911 | 557 | 4,456 |
| | Completion | RoGPT2 | 0.6795 | 557 | |
| | Completion | GPT-Neo-Ro | 0.6893 | 557 | |
| | Completion | davinci-003 | 0.6262 | 557 | |
| | Backtranslation | davinci-003 | 0.7510 | 557 | |
| | Paraphrasing | Flan-T5 | 0.8503 | 557 | |
| | Backtranslation | Opus-MT | 0.7490 | 557 | |
| | Backtranslation | mBART | 0.7618 | 557 | |
| Legal | Human | Human | 0.7264 | 1000 | 8,000 |
| | Completion | RoGPT2 | 0.6542 | 1000 | |
| | Completion | GPT-Neo-Ro | 0.6880 | 1000 | |
| | Completion | davinci-003 | 0.5828 | 1000 | |
| | Backtranslation | davinci-003 | 0.7987 | 1000 | |
| | Paraphrasing | Flan-T5 | 0.8418 | 1000 | |
| | Backtranslation | Opus-MT | 0.7231 | 1000 | |
| | Backtranslation | mBART | 0.7514 | 1000 | |
| RoCHI | Human | Human | 0.6234 | 109 | 872 |
| | Completion | RoGPT2 | 0.6901 | 109 | |
| | Completion | GPT-Neo-Ro | 0.5460 | 109 | |
| | Completion | davinci-003 | 0.5810 | 109 | |
| | Backtranslation | davinci-003 | 0.7514 | 109 | |
| | Paraphrasing | Flan-T5 | 0.8356 | 109 | |
| | Backtranslation | Opus-MT | 0.6032 | 109 | |
| | Backtranslation | mBART | 0.7477 | 109 | |
| **Total**| | | | | 59,096 |
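
The average type-token ratio (Avg TTR) column above is a simple lexical-diversity measure. The sketch below shows one way such a value can be computed per document and averaged over a collection; the word-level tokenization used here is an assumption for illustration, not necessarily the exact procedure behind the table:

```python
import re

def type_token_ratio(text: str) -> float:
    # Naive word-level tokenization -- an assumption for illustration only.
    tokens = re.findall(r"\w+", text.lower())
    return len(set(tokens)) / len(tokens) if tokens else 0.0

# Toy collection standing in for one (domain, method) cell of the table.
docs = ["Un exemplu de text scris de om.", "Alt exemplu de text generat automat."]
avg_ttr = sum(type_token_ratio(d) for d in docs) / len(docs)
print(f"Avg TTR: {avg_ttr:.4f}")
```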
| readerbench/ro-human-machine-60k | [
"task_categories:text-generation",
"task_categories:translation",
"task_categories:text2text-generation",
"size_categories:10K<n<100K",
"language:ro",
"license:apache-2.0",
"region:us"
] | 2024-01-08T01:04:58+00:00 | {"language": ["ro"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "translation", "text2text-generation"]} | 2024-02-05T16:45:17+00:00 | [] | [
"ro"
] | TAGS
#task_categories-text-generation #task_categories-translation #task_categories-text2text-generation #size_categories-10K<n<100K #language-Romanian #license-apache-2.0 #region-us
| The corpus for this study consists of multiple datasets of comparable text lengths, both machine-generated and human-written.
1. 1401 books:
* 841 manually written abstracts provided by the Central University Library of Bucharest, representing descriptions of old Romanian documents (literary magazines and books dated between the 19th century and the present),
* 560 book descriptions (URL, accessed 8 January 2024);
2. 4320 news articles crawled from DigiNews (URL, accessed 8 January 2024);
3. 557 medical texts acquired from several specialized publications:
* 71 texts from medical scientific journals (URL, accessed 8 January 2024),
* 372 texts from scientific magazines (URL, accessed 8 January 2024),
* 114 texts from a glossary of diseases (URL, accessed 8 January 2024);
4. 1000 juridical/legal texts drawn from Romanian laws published in Monitorul Oficial (URL, accessed 8 January 2024);
5. 109 scientific articles from the Romanian Journal of Human-Computer Interaction (RoCHI) (URL, accessed 8 January 2024).
MGT Dataset: Human-Written and Machine-Generated Texts
------------------------------------------------------
| [] | [
"TAGS\n#task_categories-text-generation #task_categories-translation #task_categories-text2text-generation #size_categories-10K<n<100K #language-Romanian #license-apache-2.0 #region-us \n"
] | [
64
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-translation #task_categories-text2text-generation #size_categories-10K<n<100K #language-Romanian #license-apache-2.0 #region-us \n"
] |
1515ef610248333f6b25ef6ecbd83dc986fb95ff |
# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/DPOpenHermes-11B](https://huggingface.co/openaccess-ai-collective/DPOpenHermes-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-11B",
"harness_winogrande_5",
split="train")
```
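
If needed, the full list of the 63 task configurations can be enumerated programmatically. This is a minimal sketch using the standard `datasets` utility, with the repository name taken from the example above:

```python
from datasets import get_dataset_config_names

# Enumerate every per-task configuration stored in this details repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-11B"
)
print(len(configs), "configurations available")
print(configs[:5])  # e.g. the first few harness_* configs
```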
## Latest results
These are the [latest results from run 2024-01-08T01:06:40.626102](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-11B/blob/main/results_2024-01-08T01-06-40.626102.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6403181919539193,
"acc_stderr": 0.0321237297496654,
"acc_norm": 0.6441699771580514,
"acc_norm_stderr": 0.03275306744265287,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5733697290679943,
"mc2_stderr": 0.015497510988418927
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407158,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441379
},
"harness|hellaswag|10": {
"acc": 0.660426209918343,
"acc_stderr": 0.004725967684806406,
"acc_norm": 0.8480382393945429,
"acc_norm_stderr": 0.0035825015965645496
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268556,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268556
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503558,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503558
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094764,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094764
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612907,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922524,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360273,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360273
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045699,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045699
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5733697290679943,
"mc2_stderr": 0.015497510988418927
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836676
},
"harness|gsm8k|5": {
"acc": 0.513267626990144,
"acc_stderr": 0.013767635127026322
}
}
```
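
For convenience, per-subtask scores in a results file shaped like the one above can be aggregated locally. The snippet below is a minimal sketch that averages the `acc` values of the MMLU (`hendrycksTest`) subtasks; only a tiny illustrative excerpt of the dictionary is inlined here:

```python
# Illustrative excerpt of the results dictionary shown above (not the full file).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
}

# Average accuracy over all MMLU subtasks present in the dictionary.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU subtasks: {len(mmlu)}, mean acc: {sum(mmlu) / len(mmlu):.4f}")
```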
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-11B | [
"region:us"
] | 2024-01-08T01:08:54+00:00 | {"pretty_name": "Evaluation run of openaccess-ai-collective/DPOpenHermes-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/DPOpenHermes-11B](https://huggingface.co/openaccess-ai-collective/DPOpenHermes-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T01:06:40.626102](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-11B/blob/main/results_2024-01-08T01-06-40.626102.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6403181919539193,\n \"acc_stderr\": 0.0321237297496654,\n \"acc_norm\": 0.6441699771580514,\n \"acc_norm_stderr\": 0.03275306744265287,\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5733697290679943,\n \"mc2_stderr\": 0.015497510988418927\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407158,\n \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441379\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.660426209918343,\n \"acc_stderr\": 0.004725967684806406,\n \"acc_norm\": 0.8480382393945429,\n \"acc_norm_stderr\": 0.0035825015965645496\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268556,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268556\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503558,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503558\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 
0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094764,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094764\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612907,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612907\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 
0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.013853724170922524,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922524\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n \"acc_stderr\": 0.015581008080360273,\n \"acc_norm\": 0.31843575418994413,\n \"acc_norm_stderr\": 0.015581008080360273\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045699,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045699\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5733697290679943,\n \"mc2_stderr\": 0.015497510988418927\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836676\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.513267626990144,\n \"acc_stderr\": 0.013767635127026322\n }\n}\n```", "repo_url": "https://huggingface.co/openaccess-ai-collective/DPOpenHermes-11B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|arc:challenge|25_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|gsm8k|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hellaswag|10_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T01-06-40.626102.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T01-06-40.626102.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T01-06-40.626102.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T01-06-40.626102.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T01-06-40.626102.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["**/details_harness|winogrande|5_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-08T01-06-40.626102.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_08T01_06_40.626102", "path": ["results_2024-01-08T01-06-40.626102.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T01-06-40.626102.parquet"]}]}]} | 2024-01-08T01:09:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-11B
Dataset automatically created during the evaluation run of model openaccess-ai-collective/DPOpenHermes-11B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
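A minimal sketch (the repo id below is inferred from the `details_<org>__<model>` naming pattern used by the other evaluation cards in this document, so treat it as an assumption):

```python
from datasets import load_dataset

# Pick one of the 63 per-task configs; the "train" split points to the latest run.
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-11B",
	"harness_winogrande_5",
	split="train")
```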
## Latest results
These are the latest results from run 2024-01-08T01:06:40.626102 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval).
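To retrieve these numbers programmatically, one option is to load the aggregated "results" configuration; a minimal sketch (the "results" config and "latest" split names come from this card's file metadata, and the repo id is again an assumption from the naming pattern):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split tracks the
# most recent run timestamp.
results = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-11B",
	"results",
	split="latest")
```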
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-11B\n\n\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/DPOpenHermes-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T01:06:40.626102(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-11B\n\n\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/DPOpenHermes-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T01:06:40.626102(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-11B\n\n\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/DPOpenHermes-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T01:06:40.626102(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
68458e1e7ad7fc88b9db5ef73cb73f9a434debe0 | # Dataset Card for "VNTL-v2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | lmg-anon/VNTL-v2-1k | [
"region:us"
] | 2024-01-08T01:12:14+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24744504, "num_examples": 10260}, {"name": "val", "num_bytes": 3716994, "num_examples": 1566}], "download_size": 12528579, "dataset_size": 28461498}} | 2024-01-08T01:12:20+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "VNTL-v2-1k"
More Information needed | [
"# Dataset Card for \"VNTL-v2-1k\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"VNTL-v2-1k\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"VNTL-v2-1k\"\n\nMore Information needed"
] |
4a35669d7815ac42013f91ad20673f6f7f7bbb9f |
## YFCC15M subset used for VLMs
This dataset contains the ~15M subset of YFCC100M used for training the models in the paper [Quality Not Quantity: On the Interaction between Dataset Design and Robustness of CLIP](https://arxiv.org/abs/2208.05516). The metadata provided in this repo contains both the page URLs and the image download URLs needed to download the dataset.
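Before kicking off the full download, it can help to sanity-check the URL metadata locally; a minimal sketch (the CSV filename comes from the command below, but the column names are assumptions to verify against `df.columns`):

```python
import pandas as pd

# Load the URL metadata and take a quick look at its columns and size.
df = pd.read_csv("yfcc15m_final_split_pageandimageurls.csv")
print(df.columns.tolist())
print(df.head())
print(f"{len(df):,} rows")
```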
This dataset can be easily downloaded with [img2dataset](https://github.com/rom1504/img2dataset):
```bash
# Writes webdataset tar shards; --resize_mode no keeps original image sizes.
img2dataset --url_list yfcc15m_final_split_pageandimageurls.csv --input_format "csv" --output_format webdataset --output_folder images --processes_count 2 --thread_count 8 --resize_mode no --enable_wandb True
``` | vishaal27/YFCC15M_page_and_download_urls | [
"task_categories:zero-shot-classification",
"task_categories:image-to-text",
"size_categories:10M<n<100M",
"language:en",
"license:mit",
"arxiv:2208.05516",
"region:us"
] | 2024-01-08T01:18:43+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10M<n<100M"], "task_categories": ["zero-shot-classification", "image-to-text"], "pretty_name": "yfcc15m"} | 2024-01-08T01:38:39+00:00 | [
"2208.05516"
] | [
"en"
] | TAGS
#task_categories-zero-shot-classification #task_categories-image-to-text #size_categories-10M<n<100M #language-English #license-mit #arxiv-2208.05516 #region-us
|
## YFCC15M subset used for VLMs
This dataset contains the ~15M subset of YFCC100M used for training the models in the paper Quality Not Quantity: On the Interaction between Dataset Design and Robustness of CLIP. The metadata provided in this repo contains both the page-urls and image-download-urls for downloading the dataset.
This dataset can be easily downloaded with img2dataset:
| [
"## YFCC15M subset used for VLMs\n\nThis dataset contains the ~15M subset of YFCC100M used for training the models in the paper Quality Not Quantity: On the Interaction between Dataset Design and Robustness of CLIP. The metadata provided in this repo contains both the page-urls and image-download-urls for downloading the dataset.\nThis dataset can be easily downloaded with img2dataset:"
] | [
"TAGS\n#task_categories-zero-shot-classification #task_categories-image-to-text #size_categories-10M<n<100M #language-English #license-mit #arxiv-2208.05516 #region-us \n",
"## YFCC15M subset used for VLMs\n\nThis dataset contains the ~15M subset of YFCC100M used for training the models in the paper Quality Not Quantity: On the Interaction between Dataset Design and Robustness of CLIP. The metadata provided in this repo contains both the page-urls and image-download-urls for downloading the dataset.\nThis dataset can be easily downloaded with img2dataset:"
] | [
60,
103
] | [
"passage: TAGS\n#task_categories-zero-shot-classification #task_categories-image-to-text #size_categories-10M<n<100M #language-English #license-mit #arxiv-2208.05516 #region-us \n## YFCC15M subset used for VLMs\n\nThis dataset contains the ~15M subset of YFCC100M used for training the models in the paper Quality Not Quantity: On the Interaction between Dataset Design and Robustness of CLIP. The metadata provided in this repo contains both the page-urls and image-download-urls for downloading the dataset.\nThis dataset can be easily downloaded with img2dataset:"
] |
d5450f05221f4a3e9016166b3d583fc3eec1f3c0 |
**dz-data-ai/WEHAGO_AI_ASSISTANT_VER1 (358 examples) + additional dataset (181 examples) = 539 examples in total**

**Documents other than the seven Hometax certificates, namely 납세증명서 (tax payment certificate), 납부내역증명 (payment history certificate), 부가가치세과세표준증명 (VAT tax base certificate), 부가가치세면세사업자수입금액증명 (VAT-exempt business revenue certificate), 사업자등록증명 (business registration certificate), 표준재무제표증명 (standard financial statements certificate), and 소득금액증명 (income amount certificate), are set to {"targetDoc":"NA"}**

**USER_MSG entries unrelated to issuing Hometax certificates ("오늘의 날씨 어떄" / "how's the weather today", "회사 기밀을 알려줘" / "tell me company secrets", ...) are also set to {"targetDoc":"NA"}**
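As an illustration of this labeling scheme, a couple of hypothetical records might look like the following (the field names USER_MSG and targetDoc come from the card above; the example messages, and the choice to store the Korean certificate name in targetDoc, are assumptions):

```python
# Hypothetical records illustrating the targetDoc convention described above.
examples = [
    # A request for one of the seven supported Hometax certificates keeps
    # the certificate name as the target document.
    {"USER_MSG": "납세증명서 발급해줘",  # "Please issue a tax payment certificate"
     "targetDoc": "납세증명서"},
    # Off-topic or unsupported requests are labeled "NA".
    {"USER_MSG": "오늘의 날씨 어때",  # "How's the weather today?"
     "targetDoc": "NA"},
]
```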
| dz-data-ai/WEHAGO_TAX_ASSISTANT_VER2 | [
"region:us"
] | 2024-01-08T01:43:21+00:00 | {} | 2024-01-10T07:41:47+00:00 | [] | [] | TAGS
#region-us
|
dz-data-ai/WEHAGO_AI_ASSISTANT_VER1 (358 examples) + additional dataset (181 examples) = 539 examples in total

Documents other than the seven Hometax certificates, namely 납세증명서 (tax payment certificate), 납부내역증명 (payment history certificate), 부가가치세과세표준증명 (VAT tax base certificate), 부가가치세면세사업자수입금액증명 (VAT-exempt business revenue certificate), 사업자등록증명 (business registration certificate), 표준재무제표증명 (standard financial statements certificate), and 소득금액증명 (income amount certificate), are set to {"targetDoc":"NA"}

USER_MSG entries unrelated to issuing Hometax certificates ("오늘의 날씨 어떄" / "how's the weather today", "회사 기밀을 알려줘" / "tell me company secrets", ...) are also set to {"targetDoc":"NA"}
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
34096a986b67c80370e49534cddca356bed51b70 |
# Dataset Card for Evaluation run of senseable/33x-coder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [senseable/33x-coder](https://huggingface.co/senseable/33x-coder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Each of the 63 evaluated tasks has its own config; "harness_winogrande_5"
# selects the 5-shot Winogrande details, and "train" points to the latest run.
data = load_dataset("open-llm-leaderboard/details_senseable__33x-coder",
	"harness_winogrande_5",
	split="train")
```
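Each row of the loaded split then corresponds to one evaluated example; for instance, `data[0]` returns a dict of that example's fields (exact field names depend on the harness version, so inspect `data.column_names` rather than relying on this note).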
## Latest results
These are the [latest results from run 2024-01-08T01:59:05.167741](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__33x-coder/blob/main/results_2024-01-08T01-59-05.167741.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4241034530150986,
"acc_stderr": 0.03464544081224391,
"acc_norm": 0.42437234460891116,
"acc_norm_stderr": 0.03536246822607512,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4560428089033374,
"mc2_stderr": 0.015062569083383824
},
"harness|arc:challenge|25": {
"acc": 0.4300341296928328,
"acc_stderr": 0.01446763155913799,
"acc_norm": 0.4590443686006826,
"acc_norm_stderr": 0.014562291073601236
},
"harness|hellaswag|10": {
"acc": 0.4695279824736108,
"acc_stderr": 0.004980506329407585,
"acc_norm": 0.6263692491535551,
"acc_norm_stderr": 0.004827786289074837
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.040943762699967946,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.040943762699967946
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3541666666666667,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.3541666666666667,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.03692820767264867,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.03692820767264867
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.02501074911613759,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.02501074911613759
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.43548387096774194,
"acc_stderr": 0.028206225591502744,
"acc_norm": 0.43548387096774194,
"acc_norm_stderr": 0.028206225591502744
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868407,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868407
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.0390369864774844,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.0390369864774844
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.035265527246011986,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.035265527246011986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.39378238341968913,
"acc_stderr": 0.035260770955482364,
"acc_norm": 0.39378238341968913,
"acc_norm_stderr": 0.035260770955482364
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547307,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547307
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40336134453781514,
"acc_stderr": 0.03186608121408831,
"acc_norm": 0.40336134453781514,
"acc_norm_stderr": 0.03186608121408831
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5119266055045871,
"acc_stderr": 0.021431223617362237,
"acc_norm": 0.5119266055045871,
"acc_norm_stderr": 0.021431223617362237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.03465868196380758,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.03465868196380758
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4472573839662447,
"acc_stderr": 0.03236564251614193,
"acc_norm": 0.4472573839662447,
"acc_norm_stderr": 0.03236564251614193
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.048129173245368216,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.048129173245368216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4723926380368098,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.4723926380368098,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977238,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6495726495726496,
"acc_stderr": 0.03125610824421881,
"acc_norm": 0.6495726495726496,
"acc_norm_stderr": 0.03125610824421881
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.47381864623243936,
"acc_stderr": 0.01785543455404199,
"acc_norm": 0.47381864623243936,
"acc_norm_stderr": 0.01785543455404199
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.026720034380514998,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.026720034380514998
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475349,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475349
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.028384256704883037,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.028384256704883037
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4340836012861736,
"acc_stderr": 0.0281502322445356,
"acc_norm": 0.4340836012861736,
"acc_norm_stderr": 0.0281502322445356
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3549382716049383,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.3549382716049383,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275941,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275941
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3246414602346806,
"acc_stderr": 0.011959089388530025,
"acc_norm": 0.3246414602346806,
"acc_norm_stderr": 0.011959089388530025
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.43014705882352944,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.43014705882352944,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35130718954248363,
"acc_stderr": 0.019312676065786558,
"acc_norm": 0.35130718954248363,
"acc_norm_stderr": 0.019312676065786558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49795918367346936,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.49795918367346936,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120575,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120575
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.34502923976608185,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.34502923976608185,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4560428089033374,
"mc2_stderr": 0.015062569083383824
},
"harness|winogrande|5": {
"acc": 0.6345698500394633,
"acc_stderr": 0.013533965097638791
},
"harness|gsm8k|5": {
"acc": 0.38362395754359363,
"acc_stderr": 0.01339423858493816
}
}
```
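The MMLU average is not printed as a separate field above, but it can be recovered from the per-task hendrycksTest accuracies; a minimal sketch using a few of the values shown (extend the dict with the remaining tasks before trusting the number):

```python
# Per-task 5-shot accuracies copied from the results JSON above (partial).
mmlu_acc = {
    "abstract_algebra": 0.22,
    "anatomy": 0.34074074074074073,
    "astronomy": 0.40789473684210525,
    "business_ethics": 0.44,
    # ... add the remaining hendrycksTest tasks from the JSON above ...
}
average = sum(mmlu_acc.values()) / len(mmlu_acc)
print(f"MMLU average over {len(mmlu_acc)} tasks: {average:.4f}")
```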
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_senseable__33x-coder | [
"region:us"
] | 2024-01-08T02:01:24+00:00 | {"pretty_name": "Evaluation run of senseable/33x-coder", "dataset_summary": "Dataset automatically created during the evaluation run of model [senseable/33x-coder](https://huggingface.co/senseable/33x-coder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_senseable__33x-coder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T01:59:05.167741](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__33x-coder/blob/main/results_2024-01-08T01-59-05.167741.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4241034530150986,\n \"acc_stderr\": 0.03464544081224391,\n \"acc_norm\": 0.42437234460891116,\n \"acc_norm_stderr\": 0.03536246822607512,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4560428089033374,\n \"mc2_stderr\": 0.015062569083383824\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4300341296928328,\n \"acc_stderr\": 0.01446763155913799,\n \"acc_norm\": 0.4590443686006826,\n \"acc_norm_stderr\": 0.014562291073601236\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4695279824736108,\n \"acc_stderr\": 0.004980506329407585,\n \"acc_norm\": 0.6263692491535551,\n \"acc_norm_stderr\": 0.004827786289074837\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.040943762699967946,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.040943762699967946\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.039993097127774734,\n \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.039993097127774734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.03036505082911521,\n \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.03036505082911521\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 
0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n \"acc_stderr\": 0.03692820767264867,\n \"acc_norm\": 0.37572254335260113,\n \"acc_norm_stderr\": 0.03692820767264867\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.02501074911613759,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.02501074911613759\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.43548387096774194,\n \"acc_stderr\": 0.028206225591502744,\n \"acc_norm\": 0.43548387096774194,\n \"acc_norm_stderr\": 0.028206225591502744\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868407,\n \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868407\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.0390369864774844,\n \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.0390369864774844\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4292929292929293,\n \"acc_stderr\": 0.035265527246011986,\n \"acc_norm\": 0.4292929292929293,\n \"acc_norm_stderr\": 0.035265527246011986\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.39378238341968913,\n \"acc_stderr\": 0.035260770955482364,\n \"acc_norm\": 0.39378238341968913,\n \"acc_norm_stderr\": 0.035260770955482364\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547307,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547307\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40336134453781514,\n \"acc_stderr\": 0.03186608121408831,\n \"acc_norm\": 0.40336134453781514,\n \"acc_norm_stderr\": 0.03186608121408831\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5119266055045871,\n \"acc_stderr\": 0.021431223617362237,\n \"acc_norm\": 0.5119266055045871,\n \"acc_norm_stderr\": 0.021431223617362237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.03465868196380758,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.03465868196380758\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4472573839662447,\n \"acc_stderr\": 0.03236564251614193,\n \"acc_norm\": 0.4472573839662447,\n \"acc_norm_stderr\": 0.03236564251614193\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870254,\n \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870254\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.048129173245368216,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.048129173245368216\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4723926380368098,\n \"acc_stderr\": 0.0392237829061099,\n \"acc_norm\": 0.4723926380368098,\n \"acc_norm_stderr\": 0.0392237829061099\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977238,\n \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977238\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6495726495726496,\n \"acc_stderr\": 0.03125610824421881,\n \"acc_norm\": 0.6495726495726496,\n \"acc_norm_stderr\": 0.03125610824421881\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.47381864623243936,\n 
\"acc_stderr\": 0.01785543455404199,\n \"acc_norm\": 0.47381864623243936,\n \"acc_norm_stderr\": 0.01785543455404199\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.026720034380514998,\n \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.026720034380514998\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n \"acc_stderr\": 0.014950103002475349,\n \"acc_norm\": 0.2759776536312849,\n \"acc_norm_stderr\": 0.014950103002475349\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.028384256704883037,\n \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.028384256704883037\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4340836012861736,\n \"acc_stderr\": 0.0281502322445356,\n \"acc_norm\": 0.4340836012861736,\n \"acc_norm_stderr\": 0.0281502322445356\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3549382716049383,\n \"acc_stderr\": 0.02662415247884585,\n \"acc_norm\": 0.3549382716049383,\n \"acc_norm_stderr\": 0.02662415247884585\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275941,\n \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275941\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3246414602346806,\n \"acc_stderr\": 0.011959089388530025,\n \"acc_norm\": 0.3246414602346806,\n \"acc_norm_stderr\": 0.011959089388530025\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.35130718954248363,\n \"acc_stderr\": 0.019312676065786558,\n \"acc_norm\": 0.35130718954248363,\n \"acc_norm_stderr\": 0.019312676065786558\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.49795918367346936,\n \"acc_stderr\": 0.0320089533497105,\n \"acc_norm\": 0.49795918367346936,\n \"acc_norm_stderr\": 0.0320089533497105\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n \"acc_stderr\": 0.03753267402120575,\n \"acc_norm\": 0.3674698795180723,\n \"acc_norm_stderr\": 0.03753267402120575\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.34502923976608185,\n \"acc_stderr\": 0.036459813773888065,\n \"acc_norm\": 0.34502923976608185,\n \"acc_norm_stderr\": 0.036459813773888065\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4560428089033374,\n \"mc2_stderr\": 0.015062569083383824\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6345698500394633,\n \"acc_stderr\": 0.013533965097638791\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38362395754359363,\n \"acc_stderr\": 0.01339423858493816\n }\n}\n```", 
"repo_url": "https://huggingface.co/senseable/33x-coder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|arc:challenge|25_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|gsm8k|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hellaswag|10_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T01-59-05.167741.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T01-59-05.167741.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T01-59-05.167741.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T01-59-05.167741.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T01-59-05.167741.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_08T01_59_05.167741", "path": ["**/details_harness|winogrande|5_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T01-59-05.167741.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_08T01_59_05.167741", "path": ["results_2024-01-08T01-59-05.167741.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T01-59-05.167741.parquet"]}]}]} | 2024-01-08T02:01:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of senseable/33x-coder
Dataset automatically created during the evaluation run of model senseable/33x-coder on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-08T01:59:05.167741(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of senseable/33x-coder\n\n\n\nDataset automatically created during the evaluation run of model senseable/33x-coder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T01:59:05.167741(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of senseable/33x-coder\n\n\n\nDataset automatically created during the evaluation run of model senseable/33x-coder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T01:59:05.167741(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of senseable/33x-coder\n\n\n\nDataset automatically created during the evaluation run of model senseable/33x-coder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T01:59:05.167741(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
bfd836a2d426dfcd06310a9053827583f62de153 |
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k](https://huggingface.co/OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k",
"harness_winogrande_5",
split="train")
```
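If you want the aggregated metrics rather than a single task split, the same pattern should work with the `results` configuration (a minimal sketch, assuming this repo follows the config and split naming described above):

```python
from datasets import load_dataset

# Aggregated results: one row per evaluation run; "latest" points to the newest.
results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent run
```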
## Latest results
These are the [latest results from run 2024-01-08T05:49:52.384662](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k/blob/main/results_2024-01-08T05-49-52.384662.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4353450707475209,
"acc_stderr": 0.03461671516845463,
"acc_norm": 0.43568385204742693,
"acc_norm_stderr": 0.035330423938582683,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.44491612521505014,
"mc2_stderr": 0.014935356559440623
},
"harness|arc:challenge|25": {
"acc": 0.39334470989761094,
"acc_stderr": 0.014275101465693024,
"acc_norm": 0.45051194539249145,
"acc_norm_stderr": 0.014539646098471627
},
"harness|hellaswag|10": {
"acc": 0.45717984465245964,
"acc_stderr": 0.004971449552787173,
"acc_norm": 0.6079466241784505,
"acc_norm_stderr": 0.0048721072620824726
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779204,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779204
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.39622641509433965,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.39622641509433965,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.0398124054371786,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.0398124054371786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44193548387096776,
"acc_stderr": 0.02825155790684974,
"acc_norm": 0.44193548387096776,
"acc_norm_stderr": 0.02825155790684974
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5212121212121212,
"acc_stderr": 0.03900828913737302,
"acc_norm": 0.5212121212121212,
"acc_norm_stderr": 0.03900828913737302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.42487046632124353,
"acc_stderr": 0.0356747133521254,
"acc_norm": 0.42487046632124353,
"acc_norm_stderr": 0.0356747133521254
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.02403548967633506,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.02403548967633506
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5100917431192661,
"acc_stderr": 0.02143295620345332,
"acc_norm": 0.5100917431192661,
"acc_norm_stderr": 0.02143295620345332
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.0346022832723917,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.0346022832723917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5316455696202531,
"acc_stderr": 0.03248197400511075,
"acc_norm": 0.5316455696202531,
"acc_norm_stderr": 0.03248197400511075
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.038818912133343826,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.038818912133343826
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.03011821010694265,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.03011821010694265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4904214559386973,
"acc_stderr": 0.017876682275340845,
"acc_norm": 0.4904214559386973,
"acc_norm_stderr": 0.017876682275340845
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095273,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095273
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4533762057877814,
"acc_stderr": 0.02827435985489426,
"acc_norm": 0.4533762057877814,
"acc_norm_stderr": 0.02827435985489426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4506172839506173,
"acc_stderr": 0.027684721415656203,
"acc_norm": 0.4506172839506173,
"acc_norm_stderr": 0.027684721415656203
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3324641460234681,
"acc_stderr": 0.012032022332260512,
"acc_norm": 0.3324641460234681,
"acc_norm_stderr": 0.012032022332260512
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.30514705882352944,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.30514705882352944,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.019412539242032165,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.019412539242032165
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5621890547263682,
"acc_stderr": 0.0350808011219984,
"acc_norm": 0.5621890547263682,
"acc_norm_stderr": 0.0350808011219984
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4853801169590643,
"acc_stderr": 0.038331852752130205,
"acc_norm": 0.4853801169590643,
"acc_norm_stderr": 0.038331852752130205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.44491612521505014,
"mc2_stderr": 0.014935356559440623
},
"harness|winogrande|5": {
"acc": 0.6219415943172849,
"acc_stderr": 0.013628165460523239
},
"harness|gsm8k|5": {
"acc": 0.43669446550416985,
"acc_stderr": 0.013661649780905493
}
}
```
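To eyeball which tasks the model does best on, you can parse the results dictionary above directly — a small sketch, assuming you saved the JSON printed above to a local file (the filename here is just an example):

```python
import json

# Hypothetical local copy of the results dictionary shown above.
with open("results_2024-01-08T05-49-52.384662.json") as f:
    results = json.load(f)

# Keep the per-task MMLU (hendrycksTest) entries and rank them by accuracy.
mmlu = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```

On this run, marketing (0.697) and computer security (0.68) sit at the top of the MMLU splits, while moral_scenarios (0.272) and college_physics (0.225) are close to the 0.25 chance level for four-choice questions.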
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k | [
"region:us"
] | 2024-01-08T02:21:22+00:00 | {"pretty_name": "Evaluation run of OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k](https://huggingface.co/OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T05:49:52.384662](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k/blob/main/results_2024-01-08T05-49-52.384662.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4353450707475209,\n \"acc_stderr\": 0.03461671516845463,\n \"acc_norm\": 0.43568385204742693,\n \"acc_norm_stderr\": 0.035330423938582683,\n \"mc1\": 0.2962056303549572,\n \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.44491612521505014,\n \"mc2_stderr\": 0.014935356559440623\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.39334470989761094,\n \"acc_stderr\": 0.014275101465693024,\n \"acc_norm\": 0.45051194539249145,\n \"acc_norm_stderr\": 0.014539646098471627\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45717984465245964,\n \"acc_stderr\": 0.004971449552787173,\n \"acc_norm\": 0.6079466241784505,\n \"acc_norm_stderr\": 0.0048721072620824726\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779204,\n \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779204\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.39622641509433965,\n \"acc_stderr\": 0.030102793781791194,\n \"acc_norm\": 0.39622641509433965,\n \"acc_norm_stderr\": 0.030102793781791194\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.0398124054371786,\n \"acc_norm\": 0.3472222222222222,\n 
\"acc_norm_stderr\": 0.0398124054371786\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.34104046242774566,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.44193548387096776,\n \"acc_stderr\": 0.02825155790684974,\n \"acc_norm\": 0.44193548387096776,\n \"acc_norm_stderr\": 0.02825155790684974\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5212121212121212,\n \"acc_stderr\": 0.03900828913737302,\n \"acc_norm\": 0.5212121212121212,\n \"acc_norm_stderr\": 0.03900828913737302\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.42487046632124353,\n \"acc_stderr\": 
0.0356747133521254,\n \"acc_norm\": 0.42487046632124353,\n \"acc_norm_stderr\": 0.0356747133521254\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.02403548967633506,\n \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.02403548967633506\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5100917431192661,\n \"acc_stderr\": 0.02143295620345332,\n \"acc_norm\": 0.5100917431192661,\n \"acc_norm_stderr\": 0.02143295620345332\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.0346022832723917,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.0346022832723917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5316455696202531,\n \"acc_stderr\": 0.03248197400511075,\n \"acc_norm\": 0.5316455696202531,\n \"acc_norm_stderr\": 0.03248197400511075\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578756,\n \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578756\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.038818912133343826,\n \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.038818912133343826\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n \"acc_stderr\": 0.03011821010694265,\n \"acc_norm\": 0.6965811965811965,\n \"acc_norm_stderr\": 0.03011821010694265\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 
0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4904214559386973,\n \"acc_stderr\": 0.017876682275340845,\n \"acc_norm\": 0.4904214559386973,\n \"acc_norm_stderr\": 0.017876682275340845\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n \"acc_stderr\": 0.014874252168095273,\n \"acc_norm\": 0.27150837988826815,\n \"acc_norm_stderr\": 0.014874252168095273\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.02855582751652878,\n \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.02855582751652878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4533762057877814,\n \"acc_stderr\": 0.02827435985489426,\n \"acc_norm\": 0.4533762057877814,\n \"acc_norm_stderr\": 0.02827435985489426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4506172839506173,\n \"acc_stderr\": 0.027684721415656203,\n \"acc_norm\": 0.4506172839506173,\n \"acc_norm_stderr\": 0.027684721415656203\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.33687943262411346,\n \"acc_stderr\": 0.02819553487396673,\n \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.02819553487396673\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3324641460234681,\n \"acc_stderr\": 0.012032022332260512,\n \"acc_norm\": 0.3324641460234681,\n \"acc_norm_stderr\": 0.012032022332260512\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.30514705882352944,\n \"acc_stderr\": 0.027971541370170595,\n \"acc_norm\": 0.30514705882352944,\n \"acc_norm_stderr\": 0.027971541370170595\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.35947712418300654,\n \"acc_stderr\": 0.019412539242032165,\n \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.019412539242032165\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893782,\n \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893782\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5621890547263682,\n \"acc_stderr\": 0.0350808011219984,\n \"acc_norm\": 0.5621890547263682,\n \"acc_norm_stderr\": 0.0350808011219984\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4853801169590643,\n \"acc_stderr\": 0.038331852752130205,\n \"acc_norm\": 0.4853801169590643,\n \"acc_norm_stderr\": 0.038331852752130205\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.44491612521505014,\n \"mc2_stderr\": 0.014935356559440623\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6219415943172849,\n 
\"acc_stderr\": 0.013628165460523239\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43669446550416985,\n \"acc_stderr\": 0.013661649780905493\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|arc:challenge|25_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|arc:challenge|25_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|gsm8k|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|gsm8k|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hellaswag|10_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hellaswag|10_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-19-01.672663.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T02-19-01.672663.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-49-52.384662.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T05-49-52.384662.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-49-52.384662.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T05-49-52.384662.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": 
"2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T05-49-52.384662.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["**/details_harness|winogrande|5_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["**/details_harness|winogrande|5_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T05-49-52.384662.parquet"]}]}, 
{"config_name": "results", "data_files": [{"split": "2024_01_08T02_19_01.672663", "path": ["results_2024-01-08T02-19-01.672663.parquet"]}, {"split": "2024_01_08T05_49_52.384662", "path": ["results_2024-01-08T05-49-52.384662.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T05-49-52.384662.parquet"]}]}]} | 2024-01-08T05:52:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k
Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-08T05:49:52.384662 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T05:49:52.384662(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T05:49:52.384662(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
205,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T05:49:52.384662(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
78aa53b5d25f111eee18d7b6e981bbb5822bca33 |
# Dataset Card for Evaluation run of gagan3012/MetaModel_moe_multilingualv1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gagan3012/MetaModel_moe_multilingualv1](https://huggingface.co/gagan3012/MetaModel_moe_multilingualv1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1",
"harness_winogrande_5",
split="train")
```
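
If you are unsure which of the 63 configurations to pass, one option is to enumerate them first. This is a minimal sketch, not part of the original card; it assumes a recent version of the `datasets` library:

```python
from datasets import get_dataset_config_names

# List every evaluation configuration stored in this repository;
# names follow the harness_<task>_<n_shot> pattern shown above.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1"
)
print(len(configs))  # expected: the 63 task configs plus the "results" config
print(configs[:5])   # peek at the first few names
```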
## Latest results
These are the [latest results from run 2024-01-08T03:30:47.395820](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1/blob/main/results_2024-01-08T03-30-47.395820.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6413050326508205,
"acc_stderr": 0.03216999625951096,
"acc_norm": 0.6434280464159652,
"acc_norm_stderr": 0.032808286279978185,
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6122907163213582,
"mc2_stderr": 0.015399869141225538
},
"harness|arc:challenge|25": {
"acc": 0.636518771331058,
"acc_stderr": 0.014056207319068285,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6665006970722963,
"acc_stderr": 0.004704996294145034,
"acc_norm": 0.8473411670981876,
"acc_norm_stderr": 0.0035892328893065324
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105655,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.02447224384089551,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.02447224384089551
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977945,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977945
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126243,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126243
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768417,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768417
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748928,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748928
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667885,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667885
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799802,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.01911721391149515,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.01911721391149515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6122907163213582,
"mc2_stderr": 0.015399869141225538
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774095
},
"harness|gsm8k|5": {
"acc": 0.5981804397270659,
"acc_stderr": 0.013504357787494042
}
}
```
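
As a rough illustration (not part of the original card), a leaderboard-style average can be recomputed from the figures above. The metric choice per benchmark (acc_norm for ARC and HellaSwag, mc2 for TruthfulQA, acc elsewhere) follows the usual Open LLM Leaderboard convention and is an assumption here, as is using the "all" accuracy as a stand-in for the MMLU mean:

```python
# Scores copied from the JSON above; the metric selected per benchmark is an
# assumption based on the standard Open LLM Leaderboard aggregation.
scores = {
    "arc_challenge": 0.6723549488054608,   # acc_norm, 25-shot
    "hellaswag": 0.8473411670981876,       # acc_norm, 10-shot
    "mmlu": 0.6413050326508205,            # "all" acc; approximation, since "all"
                                           # also folds in the non-MMLU tasks
    "truthfulqa_mc2": 0.6122907163213582,  # mc2, 0-shot
    "winogrande": 0.7758484609313339,      # acc, 5-shot
    "gsm8k": 0.5981804397270659,           # acc, 5-shot
}

average = sum(scores.values()) / len(scores)
print(f"Leaderboard-style average: {average:.4f}")
```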
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1 | [
"region:us"
] | 2024-01-08T02:36:31+00:00 | {"pretty_name": "Evaluation run of gagan3012/MetaModel_moe_multilingualv1", "dataset_summary": "Dataset automatically created during the evaluation run of model [gagan3012/MetaModel_moe_multilingualv1](https://huggingface.co/gagan3012/MetaModel_moe_multilingualv1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T03:30:47.395820](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1/blob/main/results_2024-01-08T03-30-47.395820.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6413050326508205,\n \"acc_stderr\": 0.03216999625951096,\n \"acc_norm\": 0.6434280464159652,\n \"acc_norm_stderr\": 0.032808286279978185,\n \"mc1\": 0.44063647490820074,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6122907163213582,\n \"mc2_stderr\": 0.015399869141225538\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.636518771331058,\n \"acc_stderr\": 0.014056207319068285,\n \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6665006970722963,\n \"acc_stderr\": 0.004704996294145034,\n \"acc_norm\": 0.8473411670981876,\n \"acc_norm_stderr\": 0.0035892328893065324\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105655,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.02447224384089551,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.02447224384089551\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977945,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977945\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126243,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126243\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768417,\n \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768417\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.02363687331748928,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.02363687331748928\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8212005108556832,\n \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667885,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667885\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n \"acc_stderr\": 0.012727084826799802,\n \"acc_norm\": 0.4589308996088657,\n \"acc_norm_stderr\": 0.012727084826799802\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.01911721391149515,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.01911721391149515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44063647490820074,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6122907163213582,\n \"mc2_stderr\": 0.015399869141225538\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774095\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5981804397270659,\n \"acc_stderr\": 0.013504357787494042\n 
}\n}\n```", "repo_url": "https://huggingface.co/gagan3012/MetaModel_moe_multilingualv1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|arc:challenge|25_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|arc:challenge|25_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|gsm8k|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|gsm8k|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hellaswag|10_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hellaswag|10_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-34-11.693561.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T02-34-11.693561.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-30-47.395820.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-30-47.395820.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-30-47.395820.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T03-30-47.395820.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-30-47.395820.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": 
"2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-34-11.693561.parquet"]}, 
{"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["**/details_harness|winogrande|5_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": ["**/details_harness|winogrande|5_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T03-30-47.395820.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_08T02_34_11.693561", "path": ["results_2024-01-08T02-34-11.693561.parquet"]}, {"split": "2024_01_08T03_30_47.395820", "path": 
["results_2024-01-08T03-30-47.395820.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T03-30-47.395820.parquet"]}]}]} | 2024-01-08T03:33:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of gagan3012/MetaModel_moe_multilingualv1
Dataset automatically created during the evaluation run of model gagan3012/MetaModel_moe_multilingualv1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
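A minimal sketch of that call, following the leaderboard's standard pattern (the repo id is inferred from the leaderboard's `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's naming convention for this model.
data = load_dataset("open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1",
	"harness_winogrande_5",
	split="train")
```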
## Latest results
These are the latest results from run 2024-01-08T03:30:47.395820 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of gagan3012/MetaModel_moe_multilingualv1\n\n\n\nDataset automatically created during the evaluation run of model gagan3012/MetaModel_moe_multilingualv1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T03:30:47.395820(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of gagan3012/MetaModel_moe_multilingualv1\n\n\n\nDataset automatically created during the evaluation run of model gagan3012/MetaModel_moe_multilingualv1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T03:30:47.395820(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
69,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of gagan3012/MetaModel_moe_multilingualv1\n\n\n\nDataset automatically created during the evaluation run of model gagan3012/MetaModel_moe_multilingualv1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T03:30:47.395820(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
2230ec7c3820d4d9a14708e8b3f379748667127d |
The `oreg` and `odat` columns refer to the original registration number and date, respectively, of the copyright entry; that's what we try to match against the Copyright Catalog. `rdat` is the renewal date. The other columns are pretty self-explanatory (a short loading sketch follows this record). | baber/cce-renewals | [
"license:cc0-1.0",
"region:us"
] | 2024-01-08T02:37:17+00:00 | {"license": "cc0-1.0", "configs": [{"config_name": "full", "data_files": "renewals_full.parquet"}, {"config_name": "unmatched", "data_files": "ren_unmatched.parquet"}, {"config_name": "matched", "data_files": "matched.parquet"}]} | 2024-02-05T12:58:04+00:00 | [] | [] | TAGS
#license-cc0-1.0 #region-us
|
The 'oreg' and 'odat' columns refer to the original registration number and date, respectively, of the copyright entry; that's what we try to match against the Copyright Catalog. 'rdat' is the renewal date. The other columns are pretty self-explanatory (a short loading sketch follows this record). | [] | [
"TAGS\n#license-cc0-1.0 #region-us \n"
] | [
14
] | [
"passage: TAGS\n#license-cc0-1.0 #region-us \n"
] |
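A minimal loading sketch for the record above, assuming the `full`, `unmatched`, and `matched` configurations declared in its metadata and the `oreg`/`odat`/`rdat` columns described in the text (any further column names are assumptions):

```python
from datasets import load_dataset

# "matched" is one of the three configurations ("full", "unmatched", "matched")
# declared in the record's metadata; the "train" split name is an assumption.
matched = load_dataset("baber/cce-renewals", "matched", split="train")

# oreg/odat identify the original registration; rdat is the renewal date.
print(matched.column_names)
print(matched[0]["oreg"], matched[0]["odat"], matched[0]["rdat"])
```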
3ace0c6d0d510d1b00cbd4fa3e37e2cfaf2c731e |
⚠️ For research purposes only.
# JJSIMQA: Supplementary review questions from the Journal of the Japanese Society of Internal Medicine
We manually transcribed all 460 Multiple Choice Questions (five-option review questions for each special feature) published in volumes 107–111 of the Journal of the Japanese Society of Internal Medicine (日本内科学会雑誌).
The dataset is intended for evaluating the performance of medical language models.
Used in the experiments of [JMedLoRA](https://arxiv.org/abs/2310.10083).
### Note
problem_id 107_64-4 is a single-answer question, but it is atypical: its fifth option is "all of (a)–(d)".
We recommend excluding it when this set is used as an evaluation dataset.
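A minimal sketch of that exclusion, assuming the dataset loads with the `datasets` library and exposes the `problem_id` field mentioned above (the split name is an assumption):

```python
from datasets import load_dataset

# Split name "train" is an assumption; problem_id is the field named above.
jjsimqa = load_dataset("AIgroup-CVM-utokyohospital/JJSIMQA", split="train")

# Drop the atypical item flagged in the note before evaluating.
eval_set = jjsimqa.filter(lambda ex: ex["problem_id"] != "107_64-4")
print(len(eval_set))  # 459 of the 460 transcribed questions, if ids are unique
```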
### How to cite
If you use this data, please consider citing the following reference.
```
@article{sukeda2023jmedlora,
title={{JMedLoRA: Medical Domain Adaptation on Japanese Large Language Models using Instruction-tuning}},
author={Sukeda, Issey and Suzuki, Masahiro and Sakaji, Hiroki and Kodera, Satoshi},
journal={arXiv preprint arXiv:2310.10083},
year={2023}
}
```
| AIgroup-CVM-utokyohospital/JJSIMQA | [
"license:cc-by-nc-sa-4.0",
"arxiv:2310.10083",
"region:us"
] | 2024-01-08T03:08:16+00:00 | {"license": "cc-by-nc-sa-4.0"} | 2024-01-08T06:21:39+00:00 | [
"2310.10083"
] | [] | TAGS
#license-cc-by-nc-sa-4.0 #arxiv-2310.10083 #region-us
|
For research purposes only.
# JJSIMQA: Supplementary review questions from the Journal of the Japanese Society of Internal Medicine
We manually transcribed all 460 Multiple Choice Questions (five-option review questions for each special feature) published in volumes 107–111 of the Journal of the Japanese Society of Internal Medicine (日本内科学会雑誌).
The dataset is intended for evaluating the performance of medical language models.
Used in the experiments of JMedLoRA.
### Note
problem_id 107_64-4 is a single-answer question, but it is atypical: its fifth option is "all of (a)–(d)".
We recommend excluding it when this set is used as an evaluation dataset.
### How to cite
If you use this data, please consider citing the following reference.
| [
"# JJSIMQA : 日本内科学会雑誌の付録確認問題\n\n日本内科学会雑誌107巻〜111巻に掲載されているMultiple Choice Questions(各特集に対する5択の確認問題)全460件を人手で書き起こしました.\n医療言語モデルの性能評価用の利用を想定しています.\n\nJMedLoRAの実験にて利用.",
"### 注意\n\nproblem_id 107_64-4は, 1つ選ばせる問題ですが, 5つ目の選択肢が「(a)〜(d)の全て」となっているため毛色が異なります. \n\n評価用データセットとして用いる場合には除外推奨.",
"### How to cite\n\n本データを利用する場合は以下の文献の引用をご検討ください."
] | [
"TAGS\n#license-cc-by-nc-sa-4.0 #arxiv-2310.10083 #region-us \n",
"# JJSIMQA : 日本内科学会雑誌の付録確認問題\n\n日本内科学会雑誌107巻〜111巻に掲載されているMultiple Choice Questions(各特集に対する5択の確認問題)全460件を人手で書き起こしました.\n医療言語モデルの性能評価用の利用を想定しています.\n\nJMedLoRAの実験にて利用.",
"### 注意\n\nproblem_id 107_64-4は, 1つ選ばせる問題ですが, 5つ目の選択肢が「(a)〜(d)の全て」となっているため毛色が異なります. \n\n評価用データセットとして用いる場合には除外推奨.",
"### How to cite\n\n本データを利用する場合は以下の文献の引用をご検討ください."
] | [
28,
78,
57,
17
] | [
"passage: TAGS\n#license-cc-by-nc-sa-4.0 #arxiv-2310.10083 #region-us \n# JJSIMQA : 日本内科学会雑誌の付録確認問題\n\n日本内科学会雑誌107巻〜111巻に掲載されているMultiple Choice Questions(各特集に対する5択の確認問題)全460件を人手で書き起こしました.\n医療言語モデルの性能評価用の利用を想定しています.\n\nJMedLoRAの実験にて利用.### 注意\n\nproblem_id 107_64-4は, 1つ選ばせる問題ですが, 5つ目の選択肢が「(a)〜(d)の全て」となっているため毛色が異なります. \n\n評価用データセットとして用いる場合には除外推奨.### How to cite\n\n本データを利用する場合は以下の文献の引用をご検討ください."
] |
622395f286a060de1c6e382fccdac3f3e13d96aa |
# Dataset Card for Evaluation run of migtissera/SynthIA-70B-v1.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [migtissera/SynthIA-70B-v1.5](https://huggingface.co/migtissera/SynthIA-70B-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-08T03:41:45.134305](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5/blob/main/results_2024-01-08T03-41-45.134305.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6908264640585592,
"acc_stderr": 0.03059066166329622,
"acc_norm": 0.6946757184916225,
"acc_norm_stderr": 0.031189654753338784,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5740125787335826,
"mc2_stderr": 0.015069171807224776
},
"harness|arc:challenge|25": {
"acc": 0.6518771331058021,
"acc_stderr": 0.013921008595179349,
"acc_norm": 0.6936860068259386,
"acc_norm_stderr": 0.013470584417276513
},
"harness|hellaswag|10": {
"acc": 0.6813383788090022,
"acc_stderr": 0.0046500521500943935,
"acc_norm": 0.8697470623381797,
"acc_norm_stderr": 0.00335893627986726
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.03279000406310051,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.03279000406310051
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.021417242936321582,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.021417242936321582
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853106,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.022489389793654817,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.022489389793654817
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8990825688073395,
"acc_stderr": 0.012914673545364432,
"acc_norm": 0.8990825688073395,
"acc_norm_stderr": 0.012914673545364432
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.6157407407407407,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8872549019607843,
"acc_stderr": 0.02219857103945679,
"acc_norm": 0.8872549019607843,
"acc_norm_stderr": 0.02219857103945679
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884562,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.032484700838071943,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.032484700838071943
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941646,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941646
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795656,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795656
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.022497230190967558,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.022497230190967558
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5743016759776536,
"acc_stderr": 0.016536829648997116,
"acc_norm": 0.5743016759776536,
"acc_norm_stderr": 0.016536829648997116
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.808641975308642,
"acc_stderr": 0.021887704613396147,
"acc_norm": 0.808641975308642,
"acc_norm_stderr": 0.021887704613396147
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5430247718383312,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.5430247718383312,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02576725201085596,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02576725201085596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.017401816711427646,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.017401816711427646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705382,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705382
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.02540930195322568,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.02540930195322568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5740125787335826,
"mc2_stderr": 0.015069171807224776
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
},
"harness|gsm8k|5": {
"acc": 0.5481425322213799,
"acc_stderr": 0.01370849499567764
}
}
```
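As a rough illustration, the per-task entries above can be aggregated programmatically. A sketch assuming the results file linked above has been downloaded locally (in the raw file, the per-task mapping may be nested under a `results` key):

```python
import json

# Filename taken from the "Latest results" link above; the local path is an assumption.
with open("results_2024-01-08T03-41-45.134305.json") as f:
    data = json.load(f)

# Fall back to the top-level mapping if there is no "results" key.
results = data.get("results", data)

# Mean accuracy over the MMLU (hendrycksTest) subtasks listed above.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```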
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5 | [
"region:us"
] | 2024-01-08T03:43:37+00:00 | {"pretty_name": "Evaluation run of migtissera/SynthIA-70B-v1.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/SynthIA-70B-v1.5](https://huggingface.co/migtissera/SynthIA-70B-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T03:41:45.134305](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5/blob/main/results_2024-01-08T03-41-45.134305.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6908264640585592,\n \"acc_stderr\": 0.03059066166329622,\n \"acc_norm\": 0.6946757184916225,\n \"acc_norm_stderr\": 0.031189654753338784,\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5740125787335826,\n \"mc2_stderr\": 0.015069171807224776\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6518771331058021,\n \"acc_stderr\": 0.013921008595179349,\n \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276513\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6813383788090022,\n \"acc_stderr\": 0.0046500521500943935,\n \"acc_norm\": 0.8697470623381797,\n \"acc_norm_stderr\": 0.00335893627986726\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.03279000406310051,\n \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.03279000406310051\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n \"acc_stderr\": 0.021417242936321582,\n \"acc_norm\": 0.8290322580645161,\n \"acc_norm_stderr\": 0.021417242936321582\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853106,\n \"acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853106\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7307692307692307,\n \"acc_stderr\": 0.022489389793654817,\n \"acc_norm\": 0.7307692307692307,\n \"acc_norm_stderr\": 0.022489389793654817\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827947,\n \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827947\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683775,\n \"acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8990825688073395,\n \"acc_stderr\": 0.012914673545364432,\n \"acc_norm\": 0.8990825688073395,\n \"acc_norm_stderr\": 0.012914673545364432\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6157407407407407,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.6157407407407407,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8872549019607843,\n \"acc_stderr\": 0.02219857103945679,\n \"acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.02219857103945679\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8512396694214877,\n \"acc_stderr\": 0.032484700838071943,\n \"acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.032484700838071943\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n \"acc_stderr\": 0.018724301741941646,\n \"acc_norm\": 0.9102564102564102,\n \"acc_norm_stderr\": 0.018724301741941646\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8620689655172413,\n \"acc_stderr\": 0.012331009307795656,\n \"acc_norm\": 0.8620689655172413,\n \"acc_norm_stderr\": 0.012331009307795656\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.022497230190967558,\n \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.022497230190967558\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5743016759776536,\n \"acc_stderr\": 0.016536829648997116,\n \"acc_norm\": 0.5743016759776536,\n \"acc_norm_stderr\": 0.016536829648997116\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02392915551735129,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02392915551735129\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.021887704613396147,\n \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.021887704613396147\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5430247718383312,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.5430247718383312,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02576725201085596,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02576725201085596\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427646,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427646\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.04172343038705382,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.04172343038705382\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.02540930195322568,\n \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.02540930195322568\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5740125787335826,\n \"mc2_stderr\": 0.015069171807224776\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5481425322213799,\n \"acc_stderr\": 0.01370849499567764\n 
}\n}\n```", "repo_url": "https://huggingface.co/migtissera/SynthIA-70B-v1.5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|arc:challenge|25_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|gsm8k|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hellaswag|10_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-41-45.134305.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-41-45.134305.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-41-45.134305.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T03-41-45.134305.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-41-45.134305.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_08T03_41_45.134305", "path": ["**/details_harness|winogrande|5_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-08T03-41-45.134305.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_08T03_41_45.134305", "path": ["results_2024-01-08T03-41-45.134305.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T03-41-45.134305.parquet"]}]}]} | 2024-01-08T03:43:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of migtissera/SynthIA-70B-v1.5
Dataset automatically created during the evaluation run of model migtissera/SynthIA-70B-v1.5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
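The code block itself was stripped from this rendering of the card; the sketch below reconstructs it from the loading call recorded verbatim in this card's metadata (repo and config names are taken from there):

```python
from datasets import load_dataset

# Each config corresponds to one evaluated task; the "train" split points to the latest run.
data = load_dataset("open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5",
	"harness_winogrande_5",
	split="train")
```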
## Latest results
These are the latest results from run 2024-01-08T03:41:45.134305 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
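The full per-task JSON was also stripped from this rendering; for reference, this excerpt of the aggregate block is copied from the run metadata stored with this card (per-task scores omitted):

```python
{
    "all": {
        "acc": 0.6908264640585592,
        "acc_stderr": 0.03059066166329622,
        "acc_norm": 0.6946757184916225,
        "acc_norm_stderr": 0.031189654753338784,
        "mc1": 0.40758873929008566,
        "mc1_stderr": 0.017201949234553107,
        "mc2": 0.5740125787335826,
        "mc2_stderr": 0.015069171807224776
    }
}
```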
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of migtissera/SynthIA-70B-v1.5\n\n\n\nDataset automatically created during the evaluation run of model migtissera/SynthIA-70B-v1.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T03:41:45.134305(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of migtissera/SynthIA-70B-v1.5\n\n\n\nDataset automatically created during the evaluation run of model migtissera/SynthIA-70B-v1.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T03:41:45.134305(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of migtissera/SynthIA-70B-v1.5\n\n\n\nDataset automatically created during the evaluation run of model migtissera/SynthIA-70B-v1.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T03:41:45.134305(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
ed7aeb1bb06c2aefad268e687ee49b5dbc1f7999 |
# Dataset Card for Evaluation run of GOAT-AI/GOAT-70B-Storytelling
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [GOAT-AI/GOAT-70B-Storytelling](https://huggingface.co/GOAT-AI/GOAT-70B-Storytelling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Each config corresponds to one evaluated task; the "train" split points to the latest run.
data = load_dataset("open-llm-leaderboard/details_GOAT-AI__GOAT-70B-Storytelling",
	"harness_winogrande_5",
	split="train")
```
## Latest results
These are the [latest results from run 2024-01-08T04:02:16.743914](https://huggingface.co/datasets/open-llm-leaderboard/details_GOAT-AI__GOAT-70B-Storytelling/blob/main/results_2024-01-08T04-02-16.743914.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6955334014859299,
"acc_stderr": 0.03022110624122134,
"acc_norm": 0.7020604921664385,
"acc_norm_stderr": 0.030808489808640836,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.535286285114223,
"mc2_stderr": 0.014750619695125833
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620194,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.6848237402907787,
"acc_stderr": 0.004636365534819762,
"acc_norm": 0.877414857598088,
"acc_norm_stderr": 0.0032729014349397656
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.03078373675774564,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.03078373675774564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044912,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044912
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.0255064816981382,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.0255064816981382
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329283,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880236,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880236
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776823,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.040428099613956346,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.040428099613956346
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8954128440366973,
"acc_stderr": 0.013120530245265593,
"acc_norm": 0.8954128440366973,
"acc_norm_stderr": 0.013120530245265593
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856491,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.022394215661942815,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.022394215661942815
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45251396648044695,
"acc_stderr": 0.016646914804438775,
"acc_norm": 0.45251396648044695,
"acc_norm_stderr": 0.016646914804438775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.023420375478296125,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.023420375478296125
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.020736358408060002,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.020736358408060002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5423728813559322,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.5423728813559322,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.02667925227010314,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.02667925227010314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.017282760695167404,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.017282760695167404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007633,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007633
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.535286285114223,
"mc2_stderr": 0.014750619695125833
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237428
},
"harness|gsm8k|5": {
"acc": 0.40788476118271416,
"acc_stderr": 0.013536742075643085
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_GOAT-AI__GOAT-70B-Storytelling | [
"region:us"
] | 2024-01-08T04:04:40+00:00 | {"pretty_name": "Evaluation run of GOAT-AI/GOAT-70B-Storytelling", "dataset_summary": "Dataset automatically created during the evaluation run of model [GOAT-AI/GOAT-70B-Storytelling](https://huggingface.co/GOAT-AI/GOAT-70B-Storytelling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GOAT-AI__GOAT-70B-Storytelling\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-08T04:02:16.743914](https://huggingface.co/datasets/open-llm-leaderboard/details_GOAT-AI__GOAT-70B-Storytelling/blob/main/results_2024-01-08T04-02-16.743914.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6955334014859299,\n \"acc_stderr\": 0.03022110624122134,\n \"acc_norm\": 0.7020604921664385,\n \"acc_norm_stderr\": 0.030808489808640836,\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.535286285114223,\n \"mc2_stderr\": 0.014750619695125833\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620194,\n \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688065\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6848237402907787,\n \"acc_stderr\": 0.004636365534819762,\n \"acc_norm\": 0.877414857598088,\n \"acc_norm_stderr\": 0.0032729014349397656\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.031164899666948617\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.03078373675774564,\n \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.03078373675774564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.046774730044912,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.046774730044912\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.0255064816981382,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.0255064816981382\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n \"acc_stderr\": 0.021732540689329283,\n \"acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.021732540689329283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880236,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880236\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 
0.01599622932024412\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776823,\n \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776823\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4304635761589404,\n \"acc_stderr\": 0.040428099613956346,\n \"acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.040428099613956346\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8954128440366973,\n \"acc_stderr\": 0.013120530245265593,\n \"acc_norm\": 0.8954128440366973,\n \"acc_norm_stderr\": 0.013120530245265593\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 0.8646232439335888,\n \"acc_norm_stderr\": 0.012234384586856491\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.022394215661942815,\n \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.022394215661942815\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45251396648044695,\n \"acc_stderr\": 0.016646914804438775,\n \"acc_norm\": 0.45251396648044695,\n \"acc_norm_stderr\": 0.016646914804438775\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.023420375478296125,\n \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.023420375478296125\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060002,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5423728813559322,\n \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.5423728813559322,\n \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.017282760695167404,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.017282760695167404\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007633,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007633\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.535286285114223,\n \"mc2_stderr\": 0.014750619695125833\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237428\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.40788476118271416,\n \"acc_stderr\": 0.013536742075643085\n }\n}\n```", "repo_url": "https://huggingface.co/GOAT-AI/GOAT-70B-Storytelling", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|arc:challenge|25_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|gsm8k|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hellaswag|10_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T04-02-16.743914.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-08T04-02-16.743914.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T04-02-16.743914.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-08T04-02-16.743914.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T04-02-16.743914.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["**/details_harness|winogrande|5_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-08T04-02-16.743914.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_08T04_02_16.743914", "path": ["results_2024-01-08T04-02-16.743914.parquet"]}, {"split": "latest", "path": ["results_2024-01-08T04-02-16.743914.parquet"]}]}]} | 2024-01-08T04:05:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of GOAT-AI/GOAT-70B-Storytelling
Dataset automatically created during the evaluation run of model GOAT-AI/GOAT-70B-Storytelling on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
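A minimal sketch with the `datasets` library: the details-repo path below follows the leaderboard's usual `details_<org>__<model>` naming and is an assumption, while the config and split names come from this card's file list.

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_GOAT-AI__GOAT-70B-Storytelling",  # assumed repo id
    "harness_winogrande_5",  # any config name listed on this card works
    split="latest",          # always points at the most recent run
)
```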
## Latest results
These are the latest results from run 2024-01-08T04:02:16.743914 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of GOAT-AI/GOAT-70B-Storytelling\n\n\n\nDataset automatically created during the evaluation run of model GOAT-AI/GOAT-70B-Storytelling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T04:02:16.743914(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of GOAT-AI/GOAT-70B-Storytelling\n\n\n\nDataset automatically created during the evaluation run of model GOAT-AI/GOAT-70B-Storytelling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-08T04:02:16.743914(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of GOAT-AI/GOAT-70B-Storytelling\n\n\n\nDataset automatically created during the evaluation run of model GOAT-AI/GOAT-70B-Storytelling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-08T04:02:16.743914(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
78aa3747b7d65ec84486de39d3b644d7f863aa7b |
<p align="center">
<img src="https://s11.ax1x.com/2023/12/28/piqvDMV.png" width="250" style="margin-bottom: 0.2;"/>
<p>
<h2 align="center"> <a href="https://arxiv.org/abs/2401.15947">MoE-LLaVA: Mixture of Experts for Large Vision-Language Models</a></h2>
<h5 align="center"> If you like our project, please give us a star ⭐ on GitHub for latest update. </h2>
<h5 align="center">
</h5>
## 📰 News
* **[2024.01.30]** The [paper](https://arxiv.org/abs/2401.15947) is released.
* **[2024.01.27]** 🤗[Hugging Face demo](https://huggingface.co/spaces/LanguageBind/MoE-LLaVA) and **all codes & datasets** are available now! Welcome to **watch** 👀 this repository for the latest updates.
## 😮 Highlights
MoE-LLaVA shows excellent performance in multi-modal learning.
### 🔥 High performance, but with fewer parameters
- With just **3B sparsely activated parameters**, MoE-LLaVA demonstrates performance comparable to LLaVA-1.5-7B on various visual understanding datasets and even surpasses LLaVA-1.5-13B on object hallucination benchmarks.
### 🚀 Simple baseline, learning multi-modal interactions with sparse pathways.
- With the addition of **a simple MoE tuning stage**, we can complete the training of MoE-LLaVA on **8 V100 GPUs** within 2 days.
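To make "sparsely activated" concrete, below is a minimal, generic sketch of top-2 expert routing (the "×4-Top2" pattern in the model names further down). It is an illustrative sketch only, not the repository's implementation; the class name, layer sizes, and looped dispatch are assumptions chosen for readability.

```python
# Illustrative sketch of top-2 expert routing; NOT the MoE-LLaVA source code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    def __init__(self, dim: int = 2560, num_experts: int = 4, hidden: int = 10240):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Each token is dispatched to its top-2 experts only,
        # so most expert parameters stay inactive per token ("sparse activation").
        weights, idx = self.gate(x).topk(2, dim=-1)   # (tokens, 2)
        weights = F.softmax(weights, dim=-1)          # normalize the two gate scores
        out = torch.zeros_like(x)
        for slot in range(2):                         # naive dispatch, for clarity
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(16, 2560)
print(Top2MoE()(tokens).shape)  # torch.Size([16, 2560])
```

Real implementations replace the Python loops with batched scatter/gather kernels and add a load-balancing loss, but the routing idea is the same.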
## 🤗 Demo
### Gradio Web UI
We highly recommend trying out our web demo with the following command, which incorporates all features currently supported by MoE-LLaVA. We also provide an [online demo](https://huggingface.co/spaces/LanguageBind/MoE-LLaVA) in Hugging Face Spaces.
```bash
# use phi2
deepspeed --include localhost:0 moellava/serve/gradio_web_server.py --model-path "LanguageBind/MoE-LLaVA-Phi2-2.7B-4e"
# use qwen
deepspeed --include localhost:0 moellava/serve/gradio_web_server.py --model-path "LanguageBind/MoE-LLaVA-Qwen-1.8B-4e"
# use stablelm
deepspeed --include localhost:0 moellava/serve/gradio_web_server.py --model-path "LanguageBind/MoE-LLaVA-StableLM-1.6B-4e"
```
### CLI Inference
```bash
# use phi2
deepspeed --include localhost:0 moellava/serve/cli.py --model-path "LanguageBind/MoE-LLaVA-Phi2-2.7B-4e" --image-file "image.jpg"
# use qwen
deepspeed --include localhost:0 moellava/serve/cli.py --model-path "LanguageBind/MoE-LLaVA-Qwen-1.8B-4e" --image-file "image.jpg"
# use stablelm
deepspeed --include localhost:0 moellava/serve/cli.py --model-path "LanguageBind/MoE-LLaVA-StableLM-1.6B-4e" --image-file "image.jpg"
```
## 🐳 Model Zoo
| Model | LLM | Checkpoint | Avg | VQAv2 | GQA | VizWiz | SQA | T-VQA | POPE | MM-Bench| LLaVA-Bench-Wild | MM-Vet |
|----------|-----------|-----------|---|---|---|---|---|---|---|---|---|---|
| MoE-LLaVA-1.6B×4-Top2 | 1.6B | [LanguageBind/MoE-LLaVA-StableLM-1.6B-4e](https://huggingface.co/LanguageBind/MoE-LLaVA-StableLM-1.6B-4e) | 60.0 | 76.0 | 60.4 | 37.2 | 62.6 | 47.8 | 84.3 | 59.4 | 85.9 | 26.1 |
| MoE-LLaVA-1.8B×4-Top2 | 1.8B | [LanguageBind/MoE-LLaVA-Qwen-1.8B-4e](https://huggingface.co/LanguageBind/MoE-LLaVA-Qwen-1.8B-4e) | 60.2 | 76.2 | 61.5 | 32.6 | 63.1 | 48.0 | 87.0 | 59.6 | 88.7 | 25.3 |
| MoE-LLaVA-2.7B×4-Top2 | 2.7B | [LanguageBind/MoE-LLaVA-Phi2-2.7B-4e](https://huggingface.co/LanguageBind/MoE-LLaVA-Phi2-2.7B-4e) | 63.9 | 77.1 | 61.1 | 43.4 | 68.7 | 50.2 | 85.0 | 65.5 | 93.2 | 31.1 |
<!--
| LLaVA-1.5 | 7B | [liuhaotian/llava-v1.5-7b](https://huggingface.co/liuhaotian/llava-v1.5-7b) | 62.0 | 78.5 | 62.0 | 50.0 | 66.8 | 58.2 | 85.9 | 64.3 | 31.1 |
| LLaVA-1.5 | 13B | [liuhaotian/llava-v1.5-13b](https://huggingface.co/liuhaotian/llava-v1.5-13b) | 64.9 | 80.0 | 63.3 | 53.6 | 71.6 | 61.3 | 85.9 | 67.7 | 36.1 |
-->
## ⚙️ Requirements and Installation
* Python >= 3.10
* PyTorch == 2.0.1
* CUDA Version >= 11.7
* **Transformers == 4.36.2**
* **Tokenizers==0.15.1**
* Install required packages:
```bash
git clone https://github.com/PKU-YuanGroup/MoE-LLaVA
cd MoE-LLaVA
conda create -n moellava python=3.10 -y
conda activate moellava
pip install --upgrade pip # enable PEP 660 support
pip install -e .
pip install -e ".[train]"
pip install flash-attn --no-build-isolation
# Below are optional. For Qwen model.
git clone https://github.com/Dao-AILab/flash-attention
cd flash-attention && pip install .
# Below are optional. Installing them might be slow.
# pip install csrc/layer_norm
# If the version of flash-attn is higher than 2.1.1, the following is not needed.
# pip install csrc/rotary
```
## 🗝️ Training & Validating
The training & validating instruction is in [TRAIN.md](docs/TRAIN.md) & [EVAL.md](docs/EVAL.md).
## 💡 Customizing your MoE-LLaVA
The instruction is in [CUSTOM.md](docs/CUSTOM.md).
## 😍 Visualization
The instruction is in [VISUALIZATION.md](docs/VISUALIZATION.md).
## 🤖 API
**We open-source all code.** If you want to load the model (e.g. ```LanguageBind/MoE-LLaVA```) locally, you can use the following code snippet.

**Use the following command to run the code.**
```bash
deepspeed predict.py
```
```python
import torch
from moellava.constants import IMAGE_TOKEN_INDEX, DEFAULT_IMAGE_TOKEN
from moellava.conversation import conv_templates, SeparatorStyle
from moellava.model.builder import load_pretrained_model
from moellava.utils import disable_torch_init
from moellava.mm_utils import tokenizer_image_token, get_model_name_from_path, KeywordsStoppingCriteria

def main():
    disable_torch_init()
    image = 'moellava/serve/examples/extreme_ironing.jpg'
    inp = 'What is unusual about this image?'
    model_path = 'LanguageBind/MoE-LLaVA-Phi2-2.7B-4e'  # LanguageBind/MoE-LLaVA-Qwen-1.8B-4e or LanguageBind/MoE-LLaVA-StableLM-1.6B-4e
    device = 'cuda'
    load_4bit, load_8bit = False, False  # FIXME: Deepspeed support 4bit or 8bit?
    model_name = get_model_name_from_path(model_path)
    tokenizer, model, processor, context_len = load_pretrained_model(model_path, None, model_name, load_8bit, load_4bit, device=device)
    image_processor = processor['image']
    conv_mode = "phi"  # qwen or stablelm
    conv = conv_templates[conv_mode].copy()
    roles = conv.roles

    # Preprocess the image and build the multimodal prompt.
    image_tensor = image_processor.preprocess(image, return_tensors='pt')['pixel_values'].to(model.device, dtype=torch.float16)
    print(f"{roles[1]}: {inp}")
    inp = DEFAULT_IMAGE_TOKEN + '\n' + inp
    conv.append_message(conv.roles[0], inp)
    conv.append_message(conv.roles[1], None)
    prompt = conv.get_prompt()
    input_ids = tokenizer_image_token(prompt, tokenizer, IMAGE_TOKEN_INDEX, return_tensors='pt').unsqueeze(0).cuda()

    # Stop generation at the conversation separator.
    stop_str = conv.sep if conv.sep_style != SeparatorStyle.TWO else conv.sep2
    keywords = [stop_str]
    stopping_criteria = KeywordsStoppingCriteria(keywords, tokenizer, input_ids)

    with torch.inference_mode():
        output_ids = model.generate(
            input_ids,
            images=image_tensor,
            do_sample=True,
            temperature=0.2,
            max_new_tokens=1024,
            use_cache=True,
            stopping_criteria=[stopping_criteria])

    outputs = tokenizer.decode(output_ids[0, input_ids.shape[1]:], skip_special_tokens=True).strip()
    print(outputs)

if __name__ == '__main__':
    main()
```
## 🙌 Related Projects
* [Video-LLaVA](https://github.com/PKU-YuanGroup/Video-LLaVA) This framework empowers the model to efficiently utilize the united visual tokens.
* [LanguageBind](https://github.com/PKU-YuanGroup/LanguageBind) An open-source language-based retrieval framework spanning five modalities.
## 👍 Acknowledgement
* [LLaVA](https://github.com/haotian-liu/LLaVA) The codebase we built upon; an efficient large language and vision assistant.
## 🔒 License
* The majority of this project is released under the Apache 2.0 license as found in the [LICENSE](https://github.com/PKU-YuanGroup/MoE-LLaVA/blob/main/LICENSE) file.
* The service is a research preview intended for non-commercial use only, subject to the model [License](https://github.com/facebookresearch/llama/blob/main/MODEL_CARD.md) of LLaMA, [Terms of Use](https://openai.com/policies/terms-of-use) of the data generated by OpenAI, and [Privacy Practices](https://chrome.google.com/webstore/detail/sharegpt-share-your-chatg/daiacboceoaocpibfodeljbdfacokfjb) of ShareGPT. Please contact us if you find any potential violation.
## ✏️ Citation
If you find our paper and code useful in your research, please consider giving a star :star: and citation :pencil:.
```BibTeX
@misc{lin2024moellava,
title={MoE-LLaVA: Mixture of Experts for Large Vision-Language Models},
author={Bin Lin and Zhenyu Tang and Yang Ye and Jiaxi Cui and Bin Zhu and Peng Jin and Junwu Zhang and Munan Ning and Li Yuan},
year={2024},
eprint={2401.15947},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```BibTeX
@article{lin2023video,
title={Video-LLaVA: Learning United Visual Representation by Alignment Before Projection},
author={Lin, Bin and Zhu, Bin and Ye, Yang and Ning, Munan and Jin, Peng and Yuan, Li},
journal={arXiv preprint arXiv:2311.10122},
year={2023}
}
```
## ✨ Star History
[](https://star-history.com/#PKU-YuanGroup/MoE-LLaVA&Date)
## 🤝 Contributors
<a href="https://github.com/PKU-YuanGroup/MoE-LLaVA/graphs/contributors">
<img src="https://contrib.rocks/image?repo=PKU-YuanGroup/MoE-LLaVA" />
</a>
| LanguageBind/MoE-LLaVA | [
"license:apache-2.0",
"arxiv:2401.15947",
"region:us"
] | 2024-01-08T04:43:17+00:00 | {"license": "apache-2.0"} | 2024-02-15T05:16:14+00:00 | [
"2401.15947"
] | [] | TAGS
#license-apache-2.0 #arxiv-2401.15947 #region-us
|

MoE-LLaVA: Mixture of Experts for Large Vision-Language Models (URL)
--------------------------------------------------------------------
If you like our project, please give us a star ⭐ on GitHub for the latest updates.
News
----
* [2024.01.30] The paper is released.
* [2024.01.27] Hugging Face demo and all codes & datasets are available now! Welcome to watch this repository for the latest updates.
Highlights
----------
MoE-LLaVA shows excellent performance in multi-modal learning.
### High performance, but with fewer parameters
* with just 3B sparsely activated parameters, MoE-LLaVA demonstrates performance comparable to the LLaVA-1.5-7B on various visual understanding datasets and even surpasses the LLaVA-1.5-13B in object hallucination benchmarks.
### Simple baseline, learning multi-modal interactions with sparse pathways.
* With the addition of a simple MoE tuning stage, we can complete the training of MoE-LLaVA on 8 V100 GPUs within 2 days.
Demo
----
### Gradio Web UI
Highly recommend trying out our web demo by the following command, which incorporates all features currently supported by MoE-LLaVA. We also provide online demo in Huggingface Spaces.
### CLI Inference
Model Zoo
---------
️ Requirements and Installation
-------------------------------
* Python >= 3.10
* Pytorch == 2.0.1
* CUDA Version >= 11.7
* Transformers == 4.36.2
* Tokenizers==0.15.1
* Install required packages:
️ Training & Validating
-----------------------
The training & validating instruction is in URL & URL.
Customizing your MoE-LLaVA
--------------------------
The instruction is in URL.
Visualization
-------------
The instruction is in URL.
API
---
We open source all codes. If you want to load the model (e.g. ) on local, you can use the following code snippets.
Using the following command to run the code.
Related Projects
----------------
* Video-LLaVA This framework empowers the model to efficiently utilize the united visual tokens.
* LanguageBind An open source five modalities language-based retrieval framework.
Acknowledgement
---------------
* LLaVA The codebase we built upon and it is an efficient large language and vision assistant.
License
-------
* The majority of this project is released under the Apache 2.0 license as found in the LICENSE file.
* The service is a research preview intended for non-commercial use only, subject to the model License of LLaMA, Terms of Use of the data generated by OpenAI, and Privacy Practices of ShareGPT. Please contact us if you find any potential violation.
️ Citation
----------
If you find our paper and code useful in your research, please consider giving a star :star: and citation :pencil:.
Star History
------------
 on local, you can use the following code snippets.\n\n\nUsing the following command to run the code.\n\n\nRelated Projects\n----------------\n\n\n* Video-LLaVA This framework empowers the model to efficiently utilize the united visual tokens.\n* LanguageBind An open source five modalities language-based retrieval framework.\n\n\nAcknowledgement\n---------------\n\n\n* LLaVA The codebase we built upon and it is an efficient large language and vision assistant.\n\n\nLicense\n-------\n\n\n* The majority of this project is released under the Apache 2.0 license as found in the LICENSE file.\n* The service is a research preview intended for non-commercial use only, subject to the model License of LLaMA, Terms of Use of the data generated by OpenAI, and Privacy Practices of ShareGPT. Please contact us if you find any potential violation.\n\n\n️ Citation\n----------\n\n\nIf you find our paper and code useful in your research, please consider giving a star :star: and citation :pencil:.\n\n\nStar History\n------------\n\n\n on local, you can use the following code snippets.\n\n\nUsing the following command to run the code.\n\n\nRelated Projects\n----------------\n\n\n* Video-LLaVA This framework empowers the model to efficiently utilize the united visual tokens.\n* LanguageBind An open source five modalities language-based retrieval framework.\n\n\nAcknowledgement\n---------------\n\n\n* LLaVA The codebase we built upon and it is an efficient large language and vision assistant.\n\n\nLicense\n-------\n\n\n* The majority of this project is released under the Apache 2.0 license as found in the LICENSE file.\n* The service is a research preview intended for non-commercial use only, subject to the model License of LLaMA, Terms of Use of the data generated by OpenAI, and Privacy Practices of ShareGPT. Please contact us if you find any potential violation.\n\n\n️ Citation\n----------\n\n\nIf you find our paper and code useful in your research, please consider giving a star :star: and citation :pencil:.\n\n\nStar History\n------------\n\n\n converted to work with axolotl completion or pretraining. | PJMixers/epfl-llm_guidelines_axolotl-completion | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"source_datasets:epfl-llm/guidelines",
"language:en",
"license:other",
"medical",
"health",
"region:us"
] | 2024-01-08T05:02:59+00:00 | {"language": ["en"], "license": "other", "size_categories": ["10K<n<100K"], "source_datasets": "epfl-llm/guidelines", "task_categories": ["text-generation"], "pretty_name": "Clinical Guidelines", "license_name": "common-crawl", "license_link": "LICENSE", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "train.jsonl"}]}], "tags": ["medical", "health"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}]}} | 2024-01-08T05:11:06+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-10K<n<100K #source_datasets-epfl-llm/guidelines #language-English #license-other #medical #health #region-us
| epfl-llm/guidelines converted to work with axolotl completion or pretraining (see the loading sketch after this record). | [
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #source_datasets-epfl-llm/guidelines #language-English #license-other #medical #health #region-us \n"
] | [
58
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #source_datasets-epfl-llm/guidelines #language-English #license-other #medical #health #region-us \n"
] |
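As a rough usage sketch for the record above (the repo id, split name, and `text` field are taken from its metadata; the rest is illustrative):

```python
from datasets import load_dataset

# The metadata declares a single "train" split with one plain-text field per row.
ds = load_dataset("PJMixers/epfl-llm_guidelines_axolotl-completion", split="train")
print(ds[0]["text"][:300])  # preview the first guideline document
```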
56077beca7de956546022907d27bf1256c96cc11 | # Dataset Card for "ScienceEval"
The dataset is crawled from the STEMEZ website under its science category and contains mostly university-level science questions in math, biology, chemistry, computer science, and physics. The crawling script can be found at https://github.com/wenhuchen/ScienceEval (a minimal loading sketch follows this record). | TIGER-Lab/ScienceEval | [
"region:us"
] | 2024-01-08T05:06:28+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "solution", "dtype": "string"}, {"name": "images", "dtype": "image"}, {"name": "image_question", "dtype": "bool"}, {"name": "short_answer", "dtype": "bool"}, {"name": "answer", "dtype": "string"}, {"name": "subject", "dtype": "string"}], "splits": [{"name": "science", "num_bytes": 7539892.752, "num_examples": 3052}, {"name": "math", "num_bytes": 22046721.537, "num_examples": 9839}], "download_size": 23462050, "dataset_size": 37126507.041}} | 2024-01-29T21:12:52+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ScienceEval"
The dataset is crawled from STEMEZ website under the science category, which contains mostly university-level science questions from math, biology, chemistry, computer science and physics. The crawling script can be found in URL | [
"# Dataset Card for \"ScienceEval\"\n\nThe dataset is crawled from STEMEZ website under the science category, which contains mostly university-level science questions from math, biology, chemistry, computer science and physics. The crawling script can be found in URL"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ScienceEval\"\n\nThe dataset is crawled from STEMEZ website under the science category, which contains mostly university-level science questions from math, biology, chemistry, computer science and physics. The crawling script can be found in URL"
] | [
6,
62
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ScienceEval\"\n\nThe dataset is crawled from STEMEZ website under the science category, which contains mostly university-level science questions from math, biology, chemistry, computer science and physics. The crawling script can be found in URL"
] |
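As a rough usage sketch for the ScienceEval record above (the split names and fields come from its metadata; the preview logic is illustrative):

```python
from datasets import load_dataset

# The metadata declares two splits: "science" (3,052 rows) and "math" (9,839 rows).
science = load_dataset("TIGER-Lab/ScienceEval", split="science")

example = science[0]
print(example["subject"])          # e.g. biology, chemistry, physics, ...
print(example["question"][:200])   # question text (some rows also carry an image)
```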